How proposed changes to higher education accreditation could impact campus diversity efforts

Source: The Conversation – USA (2) – By Jimmy Aguilar, PhD Candidate in Urban Education Policy, University of Southern California

An executive order seeks to remove ‘discriminatory ideology’ in universities. Critics contend it politicizes the accreditation process. Abraham Gonzalez Fernandez via Getty Images

President Donald Trump on April 23, 2025, signed an executive order that aims to change the higher education accreditation process. It asks accrediting agencies to root out “discriminatory ideology” and roll back diversity, equity and inclusion initiatives on college campuses.

The Conversation asked Jimmy Aguilar, who studies higher education at the University of Southern California, to explain what accreditation is, why it matters and how the Trump order seeks to change it.

What is accreditation and how does it work?

Accreditation is a process that evaluates whether colleges and universities meet standards of academic rigor, institutional integrity and financial stability.

In the United States, there were 88 accrediting agencies during the 2022-23 academic year.

The agencies are formally recognized by the Department of Education and the Council for Higher Education Accreditation.

Accreditation is not a one-time stamp of approval, but a continuous process.

At its core, accreditation is a guarantor of quality in higher education.

The process involves self-assessment and peer review visits.

Colleges typically undergo a full review every five to 10 years, depending on the accrediting agency.

Institutions must meet standards for curriculum, faculty, student services and outcomes, and provide documentation.

Then, federally recognized accrediting agencies review the documentation.

Teams, often made up of peer reviewers from other colleges, conduct campus visits and evaluations before granting or renewing accreditation.

Why do universities need to be accredited?

Accreditation assures students, employers and the public that an institution meets basic academic standards.

It also signals credibility and secures federal financial support.

Without it, colleges cannot access key funding sources such as Pell Grants and federal student loans.

The funding is essential for college budgets and students’ access to higher education.

Accreditation is also required for professional licensure in fields such as teaching, nursing, medicine and law.

It also helps ensure that students can transfer credits between institutions.

What does Trump’s executive order do?

President Donald Trump displays a signed executive order in the Oval Office at the White House on April 23, 2025, in Washington.
Chip Somodevilla/Getty Images

The executive order would reshape the college accreditation system, aligning it with the administration’s political priorities. Those priorities include the rollback of DEI initiatives.

The order seeks to use federal oversight to weaken institutional DEI policies and priorities. It also promotes new standards aligned with the administration’s interpretation of “merit-based” education.

The executive order also directs the Department of Education to penalize agencies that require colleges to implement DEI-related standards.

The Trump administration claims that such standards amount to “unlawful discrimination.”

Penalties may include increased oversight or loss of federal recognition. This would render the accreditation seal meaningless, according to the executive order.

The order also proposes a broad overhaul of the accreditation process, including:

  • Promoting “intellectual diversity” in faculty hiring. The executive order argues that promoting a broader range of viewpoints among faculty will enhance academic freedom. Critics often interpret this language as an effort to increase conservative ideological representation.

  • Streamlining the process for institutions to switch accreditors. During Trump’s first term, his administration removed geographic restrictions, giving colleges more flexibility to choose. The new executive order goes further. It makes it easier for schools to leave agencies whose standards they disagree with.

  • Expanding recognition of new accrediting agencies to increase competition.

  • Linking accreditation more directly to student outcomes. This would shift focus to metrics such as graduation rates and earnings, rather than commitments to diversity or equity.

A 2023 Supreme Court ruling that outlawed affirmative action in university admissions has been a point of contention in the debate over diversity, equity and inclusion in higher education.
Joe Daniel Price/Getty Images

The executive order singles out accreditors for law schools, such as the American Bar Association, and for medical schools, such as the Liaison Committee on Medical Education.

The order accuses them of enforcing DEI standards that conflict with a 2023 Supreme Court ruling that outlawed affirmative action in university admissions.

However, the ruling was limited to race-conscious admissions. It did not directly address faculty hiring or accreditation standards.

That raises questions about whether the order’s interpretation extends beyond the scope of the court’s decision.

The ruling has nonetheless been a point of contention in the debate over diversity, equity and inclusion.

The American Association of University Professors and the Lawyers’ Committee for Civil Rights Under Law have denounced the executive order.

The groups argue that it threatens to politicize accreditation and suppress efforts to promote equity and inclusion.

Nevertheless, the order represents a push by the federal government to influence higher education governance.

The Conversation

Jimmy Aguilar does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How proposed changes to higher education accreditation could impact campus diversity efforts – https://theconversation.com/how-proposed-changes-to-higher-education-accreditation-could-impact-campus-diversity-efforts-255309

Why the traditional college major may be holding students back in a rapidly changing job market

Source: The Conversation – USA (2) – By John Weigand, Professor Emeritus of Architecture and Interior Design, Miami University

Rethinking the college major could help colleges better understand what employers and students need. Westend61/Getty Images

Colleges and universities are struggling to stay afloat.

The reasons are numerous: declining numbers of college-age students in much of the country, rising tuition at public institutions as state funding shrinks, and a growing skepticism about the value of a college degree.

Pressure is mounting to cut costs by reducing the time it takes to earn a degree from four years to three.

Students, parents and legislators increasingly prioritize return on investment and degrees that are more likely to lead to gainful employment. This has boosted enrollment in professional programs while reducing interest in traditional liberal arts and humanities majors, creating a supply-demand imbalance.

The result has been increasing financial pressure and an unprecedented number of closures and mergers, to date mostly among smaller liberal arts colleges.

To survive, institutions are scrambling to align curriculum with market demand. And they’re defaulting to the traditional college major to do so.

The college major, developed and delivered by disciplinary experts within siloed departments, continues to be the primary benchmark for academic quality and institutional performance.

This structure likely works well for professional majors governed by accreditation or licensure, or more tightly aligned with employment. But in today’s evolving landscape, reliance on the discipline-specific major may not always serve students or institutions well.

As a professor emeritus and former college administrator and dean, I argue that the college major may no longer be able to keep up with employers’ demand for combinations of skills that cross multiple academic disciplines and for career readiness, or with the flexibility students need to best position themselves for the workplace.

Students want flexibility

The college curriculum may be less flexible now than ever.
MoMo Productions/Digital Vision via Getty Images

I see students arrive on campus each year with different interests, passions and talents – eager to stitch them into meaningful lives and careers.

A more flexible curriculum is linked to student success, and students now consult AI tools such as ChatGPT to figure out course combinations that best position them for their future. They want flexibility, choice and time to redirect their studies if needed.

And yet, the moment students arrive on campus – even before they apply – they’re asked to declare a major from a list of predetermined and prescribed choices. The major, coupled with general education and other college requirements, creates an academic track that is anything but flexible.

Not surprisingly, around 80% of college students switch their majors at least once, suggesting that more flexible degree requirements would allow students to explore and combine diverse areas of interest. And the number of careers, let alone jobs, that college graduates are expected to have will only increase as technological change becomes more disruptive.

As institutions face mounting pressures to attract students and balance budgets, and the college major remains the principal metric for doing so, the curriculum may be less flexible now than ever.

How schools are responding

The college major emerged as a response to an evolving workforce that prioritized specialized knowledge.
Fuse/Corbis via Getty Images

In response to market pressures, colleges are adding new high-demand majors at a record pace. Between 2002 and 2022, the number of degree programs nationwide increased by nearly 23,000, or 40%, while enrollment grew only 8%. Some of these majors, such as cybersecurity, fashion business or entertainment design, arguably connect disciplines rather than stand out as distinct. Thus, these new majors siphon enrollment from lower-demand programs within the institution and compete with similar new majors at competitor schools.

At the same time, traditional arts and humanities majors are adding professional courses to attract students and improve employability. Yet, this adds credit hours to the degree while often duplicating content already available in other departments.

Importantly, while new programs are added, few are removed. The challenge lies in faculty tenure and governance, along with a traditional understanding that faculty set the curriculum as disciplinary experts. This makes it difficult to close or revise low-demand majors and shift resources to growth areas.

The result is a proliferation of under-enrolled programs, canceled courses and stretched resources – leading to reduced program quality and declining faculty morale.

Ironically, under the pressure of declining demand, there can be perverse incentives to grow the credit hours required in a major or in general education as a way of garnering more resources or adding courses aligned with faculty interests – all of which further expands the curriculum and stresses available resources.

Universities are also wrestling with the idea of liberal education and how to package the general education requirement.

Although liberal education is increasingly under fire, employers and students still value it.

Students’ career readiness skills – their ability to think critically and creatively, to collaborate effectively and to communicate well – remain strong predictors of future success in the workplace and in life.

Reenvisioning the college major

Even if colleges keep the requirement that students complete a major in order to earn a degree, they can allow students to bundle smaller modules – such as variable-credit minors, certificates or course sequences – into a customizable, modular major.

This lets students, guided by advisers, assemble a degree that fits their interests and goals while drawing from multiple disciplines. A few project-based courses can tie everything together and provide context.

Such a model wouldn’t undermine existing majors where demand is strong. For others, where demand for the major is declining, a flexible structure would strengthen enrollment, preserve faculty expertise rather than eliminate it, attract a growing number of nontraditional students who bring to campus previously earned credentials, and address the financial bottom line by rightsizing curriculum in alignment with student demand.

One critique of such a flexible major is that it lacks depth of study, but it is precisely the combination of curricular content that gives it depth. Another criticism is that it can’t be effectively marketed to an employer. But a customized major can be clearly named and explained to employers to highlight students’ unique skill sets.

Further, as students increasingly try to fit cocurricular experiences – such as study abroad, internships, undergraduate research or organizational leadership – into their course of study, these can also be approved as modules in a flexible curriculum.

It’s worth noting that while several schools offer interdisciplinary studies majors, these are often overprescribed or don’t grant students access to in-demand courses. For a flexible-degree model to succeed, course sections would need to be available and added or deleted in response to student demand.

Several schools also now offer microcredentials – skill-based courses or course modules that increasingly include courses in the liberal arts. But these typically need to be completed in addition to the requirements of the major.

We take the college major for granted.

Yet it’s worth noting that the major is a relatively recent invention.

Before the 20th century, students followed a broad liberal arts curriculum designed to create well-rounded, globally minded citizens. The major emerged as a response to an evolving workforce that prioritized specialized knowledge. But times change – and so can the model.

The Conversation

John Weigand does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why the traditional college major may be holding students back in a rapidly changing job market – https://theconversation.com/why-the-traditional-college-major-may-be-holding-students-back-in-a-rapidly-changing-job-market-258383

At Cannes, decency and dress codes clash with fashion’s red carpet revolution

Source: The Conversation – USA (2) – By Elizabeth Castaldo Lundén, Research Fellow at the School of Cinematic Arts, University of Southern California

Jennifer Lawrence and Robert Pattinson appear on the red carpet prior to the screening of ‘Die, My Love’ at the 78th annual Cannes Film Festival on May 17, 2025. Kristy Sparow/Getty Images

Ahead of the Cannes Film Festival, the spotlight moved from movie stars and directors to the festival’s fashion rules.

Cannes reminded guests to follow the standard black-tie dress code for evening events at the Grand Theatre Lumière – “long dresses and tuxedos” – while highlighting acceptable alternatives, such as cocktail dresses and pantsuits for women, and a black or navy suit with a tie for men.

The real stir, however, came from two additions to the formal guidelines: a ban on nudity “for decency reasons” and a restriction on oversize garments.

The new rules caught many stylists and stars by surprise, with some decrying the move as a regressive attempt to police clothing.

It’s hard not to wonder whether this is part of some broader conservative cultural shift around the world.

But I study the cultural and economic forces behind fashion and media, and I think a lot of the criticism of Cannes is unfounded. To me, the festival isn’t changing its identity. It’s reasserting it.

Red carpet control

Concerns about indecency on the red carpet have appeared before – most notably during the first televised Academy Awards in 1953.

In 1952, the National Association of Radio and Television Broadcasters adopted a censorship code in response to concerns about television’s influence on young audiences. Among its rules for “decency and decorum” were guidelines against revealing clothing, suggestive movements or camera angles that emphasized body parts – all to avoid causing “embarrassment” to the viewers.

Actress Inger Stevens at the 39th Academy Awards in 1967, a year before she was reprimanded for her skimpy attire.
Bettmann/Getty Images

To ensure that no actress would break the decency dress code, the Academy of Motion Picture Arts and Sciences hired acclaimed costume designer Edith Head as a fashion consultant for the show in 1953.

In my book “Fashion on the Red Carpet,” I explain how Head equipped backstage staff with kits to deal with any sartorial emergencies that might arise. That same year, the balcony cameras at the Pantages Theatre accidentally peeked down into the actresses’ cleavage as they walked to the stage. From then on, a supply of tulle – a versatile fabric that can easily cover openings that expose too much skin – was kept backstage.

The 1960s posed new challenges. Youth fashion trends clashed with traditional dress codes and television censorship. In 1968, after actress Inger Stevens appeared on the red carpet wearing a mini skirt, the Academy sent a letter reminding attendees of the black-tie – preferably floor-length – dress code. When Barbra Streisand’s Scaasi outfit accidentally turned see-through under the lighting in 1969, Head again warned against “freaky, far-out, unusual fashion” ahead of the 1970 ceremony.

In the 1970s, however, the Oscars eliminated Head’s fashion consultant position. The ceremony maintained its black-tie dress code, but the absence of a fashion consultant opened the door to provocative attire, from Cher’s see-through outfits to Edy Williams’ barely-there getups.

Once the fashion consultant position was eliminated for the Oscars, many attendees – like actress Edy Williams – tried to stand out from the crowd with provocative attire.
Fotos International/Getty Images

Old rules in a new era

Racy red carpet appearances have since become a hallmark of awards shows, particularly in the digital age.

Extravagance and shock are a way for celebrities and brands to stand out amid a glut of social media content, especially as brands increasingly pay a fortune to turn celebrities into walking billboards.

And in an era when red carpet looks are carefully curated ahead of time through partnerships with fashion brands, many celebrities expressed frustration about being unable to sport the outfits they had planned to wear at Cannes.

Stylist Rose Forde lamented the restrictions, saying, “You should be able to express yourself as an artist, with your style however you feel,” while actress Chloë Sevigny described the code as “an old-fashioned archaic rule.”

But I still can’t see the Cannes rules as part of any sort of broader conservative backlash.

Whether at the Oscars or the MTV Video Music Awards, backlash over celebrities baring too much skin has gone on for decades. Cannes hasn’t been spared from controversy, either: There was Michèle Morgan’s bikini in 1946, La Cicciolina’s topless look in 1988, Madonna’s Jean Paul Gaultier lingerie in 1991, Leila Depina’s barely-there pearl outfit in 2023 and Bella Hadid’s sheer pantyhose dress in 2024, to name just a few.

Cape Verdean model Leila Depina arrives for the screening of the film ‘Asteroid City’ during the 2023 Cannes Film Festival.
Christophe Simon/AFP via Getty Images

The festival has routinely reminded guests of its dress code, regardless of the cultural zeitgeist.

The “decency” rule, for example, is actually required by French law. Article 222-32 of the French Criminal Code classifies showing private parts in public as a sexual offense punishable by up to a year in prison and a fine. While the legal definition hinges on intent and setting, the festival, as a public event, technically has to operate within that framework.

Compared to white-tie events like the Nobel Prize ceremony or a state banquet, Cannes’ black-tie requirement is relatively flexible. It allows for cocktail-length dresses and even accommodates pants and flat sandals for women.

Meanwhile, the worry about voluminous clothes points to a practical issue: the movement of bodies in tight spaces.

Unlike the Met Gala – where the fashion spectacle is the focus, and its red carpet is a stage for photo-ops – Cannes is a film festival. The red carpet is the main path thousands of people use to enter the theater.

A dramatic gown – like the one worn at the Met Gala by Cardi B in 2024 – could block others and cause delays. While a photo-op may be the primary goal for celebrities and the brands they promote, the festival has a screening schedule to stick to, and attendees must be able to easily access the venue and their seats.

Red carpet rules are fluid. Sometimes they adapt to cultural shifts. Sometimes they resist them. And sometimes, they’re there to make sure you can fit in your seat in the movie theater.

The Conversation

Elizabeth Castaldo Lundén received funding from Fulbright (2023-2024)

ref. At Cannes, decency and dress codes clash with fashion’s red carpet revolution – https://theconversation.com/at-cannes-decency-and-dress-codes-clash-with-fashions-red-carpet-revolution-256948

From the marriage contract to breaking the glass under the chuppah, many Jewish couples adapt their weddings to celebrate gender equality

Source: The Conversation – USA (3) – By Samira Mehta, Associate Professor of Women and Gender Studies & Jewish Studies, University of Colorado Boulder

The ketubah is a binding document in Jewish law that traditionally spells out a groom’s responsibilities toward his wife − but that many couples adapt to be more egalitarian. PowerSiege/iStock via Getty Images Plus

Traditional Jewish weddings share one key aspect with traditional Christian weddings. Historically, the ceremony was essentially a transfer of property: A woman went from being the responsibility of her father to being the responsibility of her husband.

That may not be the first thing Americans associate with weddings today, but it lives on in rituals and vows. Think, in a traditional Christian wedding, of a bride promising “to obey” her husband, or being “given away” by her father after he walks her down the aisle.

Feminism has changed some aspects of the Christian wedding. More egalitarian or feminist couples, for example, might have the bride be “given away” by both her parents, or have both the bride and groom escorted in by parents. Others skip the “giving” altogether. Queer couples, too, have reimagined the wedding ceremony.

Mara Mooiweer, left, and Elisheva Dan dance during their socially distanced wedding in Brookline, Mass., during the COVID-19 pandemic.
Jessica Rinaldi/The Boston Globe via Getty Images

During research for my book “Beyond Chrismukkah,” about Christian-Jewish interfaith families, many interviewees wound up talking about their weddings and the rituals that they selected or innovated for the day to reflect their cultural background. Some of them had also designed their ceremonies to reflect feminism and marriage equality – something that the interfaith weddings had in common with many weddings where both members of the couple were Jewish.

These values have transformed many Jewish couples’ weddings, just as they have transformed the Christian wedding. Some Jewish couples make many changes, while some make none. And like every faith, Judaism has lots of internal diversity – not all traditional Jewish weddings look the same.

Contracts and covenants

Perhaps one of the most important places where feminism and marriage equality have reshaped traditions is in the “ketubah,” or Jewish marriage contract.

A traditional ketubah is a simple legal document in Hebrew or Aramaic, a related ancient language. Two witnesses sign the agreement, which states that the groom has acquired the bride. However, the ketubah is also sometimes framed as a tool to protect women. The document stipulates the husband’s responsibility to provide for his wife and confirms what he should pay her in case of divorce. Traditional ketubot – the plural of ketubah – did not discuss love, God or intentions for the marriage.

A groom signs the ketubah as witnesses sit beside him in Jerusalem, Israel, in 2014.
Dan Porges/Getty Images

Contemporary ketubot in more liberal branches of Judaism, whether between opposite- or same-sex couples, are usually much more egalitarian documents that reflect the home and the marriage that the couple want to create. Some couples adapt the Aramaic text; others keep the Aramaic and pair it with a text in the language they speak every day, describing their intentions for their marriage.

Rather than being simple, printed documents, contemporary ketubot are often beautiful pieces of art, made to hang in a place of prominence in the newlyweds’ home. Sometimes the art makes references to traditional Jewish symbols, such as a pomegranate for fertility and love. Other times, the artist works with the couple to personalize their decorations with images and symbols that are meaningful to them.

Contemporary couples will often also use their ketubah to address an inherent tension in Jewish marriage. Jewish law gives men much more freedom to divorce than it gives women. Because women cannot generally initiate divorce, they can end up as “agunot,” which literally means “chained”: women whose husbands have refused to grant them a religious divorce. Even if the couple have been divorced in secular court, an “agunah” cannot, according to Jewish law, remarry in a religious ceremony.

Contemporary ketubot will sometimes make a note that, while the couple hope to remain married until death, if the marriage deteriorates, the husband agrees to grant a divorce if certain conditions are met. This prevents women from being held hostage in unhappy marriages.

Other couples eschew the ketubah altogether in favor of a new type of document called a “brit ahuvim,” or covenant of lovers. These documents are egalitarian agreements between couples. The brit ahuvim was developed by Rachel Adler, a feminist rabbi with a deep knowledge of Jewish law, and is grounded in ancient Jewish laws for business partnerships between equals. That said, many Jews, including some feminists, do not see the brit ahuvim as equal in status to a ketubah.

Two female ducks are depicted on the ketubah hanging in the sunroom in Lennie Gerber and Pearl Berlin’s home in High Point, N.C.
AP Photo/Allen G. Breed

Building together

Beyond the ketubah, there are any number of other changes that couples make to symbolize their hopes for an egalitarian marriage.

Jewish ceremonies often take place under a canopy called the chuppah, which symbolizes the home that the couple create together. In a traditional Jewish wedding, the bride circles the groom three or seven times before entering the chuppah. This represents both her protection of their home and the groom’s new status as her priority.

Many couples today omit this custom, because they feel it makes the bride subservient to the groom. Others keep the circling but reinterpret it: In circling the groom, the bride actively creates their home, an act of empowerment. Other egalitarian couples, regardless of their genders, share the act of circling: Each spouse circles three times, and then the pair circle once together.

In traditional Jewish weddings, like in traditional Christian weddings, the groom gives his bride a ring to symbolize his commitment to her – and perhaps to mark her as a married woman. Many contemporary Jewish couples exchange two rings: both partners offering a gift to mark their marriage and presenting a symbol of their union to the world. While some see this shift as an adaptation to American culture, realistically, the dual-ring ceremony is a relatively new development in both American Christian and American Jewish marriage ceremonies.

Finally, Jewish weddings traditionally end when the groom stomps on and breaks a glass, and the entire crowd yells “Mazel tov” to congratulate them. People debate the symbolism of the broken glass. Some say that it reminds us that life contains both joy and sorrow, or that it is a reminder of a foundational crisis in Jewish history: the destruction of the Second Temple in Jerusalem in 70 C.E. Others say that it is a reminder that life is fragile, or that marriage, unlike the glass, is an unbreakable covenant.

Yulia Tagil and Stas Granin celebrate their union on July 25, 2010, at a square in Tel Aviv. The couple held a public wedding to protest Israeli marriage guidelines set by the chief rabbinate.
Uriel Sinai/Getty Images

Regardless of what it means, in some contemporary couples both partners step on glasses, or one partner places their foot on top of the other’s so that the newlyweds can break the glass together. The couple symbolize their commitment to equality – and both get to take part in a fun wedding custom.

There are many other innovations in contemporary Jewish weddings that have much less to do with feminism and egalitarianism, such as personalized wedding canopies or wedding programs. But these key changes represent how the wedding ceremony itself has become more egalitarian in response to both feminism and marriage equality.

The Conversation

Samira Mehta receives funding from the Henry Luce Foundation for work on Jews of Color.

ref. From the marriage contract to breaking the glass under the chuppah, many Jewish couples adapt their weddings to celebrate gender equality – https://theconversation.com/from-the-marriage-contract-to-breaking-the-glass-under-the-chuppah-many-jewish-couples-adapt-their-weddings-to-celebrate-gender-equality-229084

Where tomorrow’s scientists prefer to live − and where they’d rather not

Source: The Conversation – USA (2) – By Christopher P. Scheitle, Associate Professor of Sociology, West Virginia University

Many students have strong feelings about where they want to move after graduation. Tony Garcia/Stone via Getty Images

Graduate students interested in an academic career have often been told they need to be open to moving somewhere they may not want to live. This advice reflects how hard it is to get a tenure-track professor position.

These days, this advice may be less relevant as graduate students are increasingly pursuing and ending up in careers outside of academia.

Where graduate students want to settle post-graduation has potential consequences for communities and states across the country that depend more and more on a steady stream of skilled workers to power their economies. Locations seen as undesirable may struggle to attract and retain the next generation of scientists, engineers, professors and other professionals drawn from today’s graduate students.

We are sociologists who are examining some of the factors that influence graduate students’ educational and career paths as part of a research project supported by the National Science Foundation. In March 2025 we distributed a survey to a sample of U.S.-based graduate students in five natural and social science disciplines: physics, chemistry, biology, psychology and sociology.

As part of our survey, we asked students to identify states they would prefer to live in and places where they would be unwilling to go. To some extent, our findings match some past anecdotes and evidence about the varying number of applications received for academic positions across different states or regions.

But little data has directly assessed students’ preferences, and our survey also provides some evidence that some states’ policies are having a negative impact on their ability to attract highly educated people.

Most preferred, most unwilling

For our study, we built our sample from the top 60 graduate programs for each of the five disciplines based on rankings from U.S. News and World Report. We received responses from nearly 2,000 students. Almost all of these students – 98%, specifically – are pursuing Ph.D.s in their respective fields.

As part of our survey, we asked students to identify locations where they would “prefer” to live and also those where they would be “unwilling” to live after finishing their graduate program. For each of these questions, we presented students with a list of all states along with the option of “outside of the United States.”

Just looking at the overall percentages, California tops the list of preferred places, with 49% of all survey-takers stating a preference to live there, followed by New York at 45% and Massachusetts at 41%.

On the other hand, Alabama was selected most often as a state students said they’d be unwilling to move to, with 58% declaring they wouldn’t want to live there. This was followed by Mississippi and Arkansas, both with just above 50% saying they’d be unwilling to move to either state.

Clusters of preference

While the two lists in many respects appear to be inversions of one another, there are some exceptions. Looking beyond the overall percentages for each survey question, we used statistical analysis to identify underlying groups or clusters of states that are more similar to each other across both the “prefer” and “unwilling” questions.

One cluster, represented by California, New York and Massachusetts, is characterized by a very high level of preference and a low level of unwillingness. About 35% to 50% of students expressed a preference for living in these places, while only 5% to 10% said they would be unwilling to live in them. The response of “outside of the United States” is also in this category, which is noteworthy given recent concerns about the current generation of Ph.D. students looking to leave the country and efforts by other nations to recruit them.

A second cluster represents states where the preference levels are a bit lower, 20% to 30%, and the unwillingness levels are a bit higher, 7% to 15%. Still, these are states that graduate students generally view favorably as places to live after finishing their programs. This cluster includes states such as Colorado, Illinois, Pennsylvania, Maryland and New Jersey.

A third group of states represents locations for which the rate of preference is similar to the rate of unwillingness, in the range of 10% to 20%. This cluster includes states such as Minnesota, Delaware and Virginia.

The fourth and fifth clusters consist of states where the rate of unwillingness exceeds the rate of preference, with the size of the gap distinguishing the two clusters. In the fourth cluster, at least some students – 5% to 10% – express a preference for living in them, while around 30% to 40% say they are unwilling to live in them. This cluster includes Florida, Montana, South Carolina and Utah.

Almost no students express a preference for living in the states contained in the fifth cluster, while the highest percentages – 40% to 60% – express an unwillingness to live in them. This cluster includes Alabama, Kansas, Oklahoma and South Dakota.
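The kind of grouping described above can be sketched with a plain k-means clustering on each state’s (preference %, unwillingness %) pair. The state figures below are hypothetical numbers chosen to echo the ranges the article reports, not the study’s actual data, and the researchers’ exact statistical method is not specified here.

```python
import math

# Hypothetical (prefer %, unwilling %) pairs for a handful of states,
# loosely echoing the ranges reported in the article -- NOT the
# authors' survey numbers.
states = {
    "California": (49, 6), "New York": (45, 7), "Massachusetts": (41, 5),
    "Colorado": (28, 9), "Pennsylvania": (24, 12),
    "Minnesota": (15, 15), "Virginia": (14, 18),
    "Utah": (7, 35), "Florida": (5, 45),
    "Oklahoma": (1, 50), "Alabama": (2, 58),
}

def cluster(points, centers, iters=25):
    """Plain k-means: assign each state to its nearest center, then
    move each center to the mean of its assigned states."""
    for _ in range(iters):
        groups = {i: [] for i in range(len(centers))}
        for name, p in points.items():
            nearest = min(range(len(centers)),
                          key=lambda i: math.dist(p, centers[i]))
            groups[nearest].append(name)
        new_centers = []
        for i, c in enumerate(centers):
            g = groups[i]
            if g:  # an empty cluster keeps its old center
                new_centers.append(tuple(
                    sum(points[n][axis] for n in g) / len(g)
                    for axis in (0, 1)))
            else:
                new_centers.append(c)
        centers = new_centers
    return groups

# Seed one rough starting center per pattern the article describes:
# high-preference, middling, and high-unwillingness states.
groups = cluster(states, [(45, 5), (20, 12), (3, 50)])
```

With these illustrative inputs, the high-preference coastal states, the middling cluster and the high-unwillingness states separate cleanly, mirroring the ordering of clusters described above.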

Signs of current politics

Many factors influence our preferences for where we want to live, including family, weather and how urban, rural or suburban it is. The politics of a community can also influence our perceptions of a place’s desirability.

Indeed, political factors may be of particular concern to graduate students. In recent years, some states have taken a more hostile stance toward specific academic disciplines, institutions of higher education in general, or professions that are of interest to graduate students. While states such as Florida and Texas have been leading such efforts, many others have followed.

Interestingly, our statistical grouping of states finds that students’ unwillingness to live in states such as Texas, Florida, Georgia and Ohio is higher than we would expect given those states’ corresponding preference levels. For example, about 10% of students selected Texas as a place they would prefer to live in after graduation. Looking at other states with similar preference levels, we would expect about 10% to 20% of students to say they are unwilling to live in Texas. Instead, this percentage is actually 37%. Similarly, 5% of students say they would prefer to live in Florida. Other states with this preference rate have an unwillingness rate of around 35%, but Florida’s is 45%.

Although our data does not tell us for sure, these gaps could be a function of these states’ own policies or alignment with federal policies seen as hostile to graduate students and their future employers.

These findings suggest that communities and employers in some states might continue to face particularly steep hurdles in recruiting graduate students for employment once they finish their degrees.

The Conversation

Christopher P. Scheitle receives funding from the National Science Foundation and the John Templeton Foundation. This article is based on a study supported by the National Science Foundation (Award #2344563).

Katie Corcoran receives funding from the National Science Foundation, the John Templeton Foundation, and the Patient-Centered Outcomes Research Institute.

Taylor Remsburg receives funding from the National Science Foundation and the John Templeton Foundation as a research assistant. This article is based on a study supported by the National Science Foundation (Award #2344563).

ref. Where tomorrow’s scientists prefer to live − and where they’d rather not – https://theconversation.com/where-tomorrows-scientists-prefer-to-live-and-where-theyd-rather-not-254431

Taking intermittent quizzes reduces achievement gaps and enhances online learning, even in highly distracting environments

Source: The Conversation – USA (2) – By Jason C.K. Chan, Professor of Psychology, Iowa State University

More Americans are learning remotely. Drazen/E+ via Getty Images

Inserting brief quiz questions into an online lecture can boost learning and may reduce racial achievement gaps, even when students are tuning in remotely in a distracting environment.

That’s a main finding of our recent research published in Communications Psychology. With co-authors Dahwi Ahn, Hymnjyot Gill and Karl Szpunar, we present evidence that adding mini-quizzes into an online lecture in science, technology, engineering or mathematics – collectively known as STEM – can boost learning, especially for Black students.

In our study, we included over 700 students from two large public universities and five two-year community colleges across the U.S. and Canada. All the students watched a 20-minute video lecture on a STEM topic. Each lecture was divided into four 5-minute segments, and following each segment, the students either answered four brief quiz questions or viewed four slides reviewing the content they’d just seen.

This procedure was designed to mimic two kinds of instruction: one in which students must answer in-lecture questions, and one in which the instructor regularly reviews recently covered content in class.

All students were tested on the lecture content both at the end of the lecture and a day later.

When Black students in our study watched a lecture without intermittent quizzes, they underperformed Asian, white and Latino students by about 17%. This achievement gap was reduced to a statistically nonsignificant 3% when students answered intermittent quiz questions. We believe this is because the intermittent quizzes help students stay engaged with the lecture.

To simulate the real-world environments that students face during online classes, we manipulated distractions by having some participants watch just the lecture; the rest watched the lecture with either distracting memes on the side or with TikTok videos playing next to it.

Surprisingly, the TikTok videos enhanced learning for students who received review slides. They performed about 8% better on the end-of-day tests than those who were not shown any memes or videos, and on par with the students who answered intermittent quiz questions. Our data further showed that this unexpected finding occurred because the TikTok videos encouraged participants to keep watching the lecture.

For educators interested in using these tactics, it is important to know that the intermittent quizzing intervention only works if students must answer the questions. This is different from asking questions in a class and waiting for a volunteer to answer. As many teachers know, most students never answer questions in class. If students’ minds are wandering, the requirement of answering questions at regular intervals brings students’ attention back to the lecture.

This intervention is also different from just giving students breaks during which they engage in other activities, such as doodling, answering brain teaser questions or playing a video game.

Why it matters

Online education has grown dramatically since the pandemic. Between 2004 and 2016, the percentage of college students enrolling in fully online degrees rose from 5% to 10%. But by 2022, that number nearly tripled to 27%.

Relative to in-person classes, online classes are often associated with lower student engagement and higher failure and withdrawal rates.

Research also finds that the racial achievement gaps documented in regular classroom learning are magnified in remote settings, likely due to unequal access to technology.

Our study therefore offers a scalable, cost-effective way for schools to increase the effectiveness of online education for all students.

What’s next?

We are now exploring how to further refine this intervention through experimental work among both university and community college students.

Unlike observational studies, in which researchers track student behaviors and are subject to confounding and extraneous influences, our randomized controlled study allows us to ascertain the effectiveness of the in-class intervention.

Our ongoing research examines the optimal timing and frequency of in-lecture quizzes. We want to ensure that very frequent quizzes will not hinder student engagement or learning.

The results of this study may help provide guidance to educators for optimal implementation of in-lecture quizzes.

The Research Brief is a short take on interesting academic work.

The Conversation

Jason C.K. Chan receives funding from the U.S. National Science Foundation.

Zohara Assadipour does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Taking intermittent quizzes reduces achievement gaps and enhances online learning, even in highly distracting environments – https://theconversation.com/taking-intermittent-quizzes-reduces-achievement-gaps-and-enhances-online-learning-even-in-highly-distracting-environments-254046

Trump’s battle with elite universities overlooks where most students actually go to college

Source: The Conversation – USA (2) – By Amy Li, Associate Professor of Higher Education, Florida International University

There are nearly 20 million undergraduate college students in the United States. Anadolu/Getty Images

Headlines often mention the ongoing power struggle between President Donald Trump’s administration and private colleges such as Columbia University and Harvard University.

But such elite universities educate only a small portion of America’s total undergraduate population, which stood at 20 million in fall 2024.

As an associate professor of higher education, I have published research on policies that affect college access, retention and graduation. My work has examined data across different types of higher education institutions.

The Ivies and other elites

Less than 1% of American college students attend elite private colleges.

A small group of colleges, consisting of Ivy League schools and other highly selective universities known as “Ivy-Plus,” fit in this category.

The Ivy League consists of eight private schools that formed an athletic conference in the 1950s. The member universities are known for their academic excellence.

The Ivy-Plus are highly prestigious colleges located across the country with similar reputations for outstanding academics such as Stanford University, Duke University and the Massachusetts Institute of Technology.

These colleges have extremely competitive admissions, often accepting less than 10% of applicants.

They enroll students from high-income backgrounds more than any other type of institution. Students from upper-income families represent 60% to 70% of attendees at elite privates.

Elite private universities confer undergraduate and graduate degrees and focus on research.

Elite public colleges

Elite public colleges, such as the University of California, Berkeley, and the University of Virginia, are near the top of the U.S. News & World Report’s rankings. They also are often the flagship university in their state, such as the University of Michigan.

These colleges have highly selective admissions processes as well and often accept about 10% to 20% of applicants.

The largest portion of revenue at public universities, roughly 40%, comes from government sources that include federal, state and local government grants, contracts and appropriations, according to the National Center for Education Statistics.

Students from upper-income families constitute 50% to 55% of attendees at elite public colleges.

Like elite private colleges, elite public colleges confer undergraduate and graduate degrees and focus on research.

Community colleges

There are 1,024 community colleges in the U.S., serving 39% of undergraduate students.

These public, two-year colleges grant associate degrees and occasionally bachelor’s degrees. They also offer certificates, workforce training and noncredit courses to prepare students for college-level courses.

Community colleges have a strong teaching focus and a mission to serve their communities. They tend to guarantee admission to anyone who wants to enroll and offer lower tuition and fees.

Community colleges are also critical entry points for students from lower-income households and those who identify as racial or ethnic minorities or who are the first in their family to attend college.

Like other public institutions, community colleges depend heavily on state funding, as well as local property taxes.

Regional universities

Students with backpacks walk on campus during warm weather.
Roughly 70% of undergraduate students who attend public, four-year institutions enroll at regional public universities.
Newsday RM via Getty Images

Of all undergraduates who attend public, four-year institutions, roughly 70% enroll in regional institutions.

They include colleges in state-run systems such as the State University of New York and California State University.

There is wide variation in acceptance rates among regional public universities, but they tend to be moderately selective, accepting between 50% and 70% of applicants.

Regional public universities offer a wide range of academic programs mostly at the bachelor’s and master’s levels. They also depend heavily on state funding.

Small private colleges

Small, less selective private colleges often have acceptance rates of 60% or higher and enroll 3,000 or fewer students.

Their budgets depend primarily on tuition and fees.

Some of these types of colleges have suffered from enrollment declines since the early 2000s, exacerbated by the COVID-19 pandemic.

Many of these institutions lacked the large endowments that allowed elite privates to weather the financial challenges brought on by the pandemic.

A number of small private colleges, such as Eastern Nazarene College in Massachusetts, have closed or merged with other universities due to financial difficulties.

These small private colleges often offer academic programs at the bachelor’s and master’s levels.

Private for-profit

About 5% of students attend private for-profit colleges.

These colleges offer courses in convenient formats that may be attractive to older adult students, including those with full-time jobs.

For-profit college students disproportionately identify as older, Black and female. Students who attend these colleges are also more likely to be single parents.

In recent years, the federal government has cracked down on false promises some for-profit institutions made about their graduates’ job and earnings prospects and other outcomes.

The enforcement led to the closure of some colleges, such as ITT Technical Institute and Corinthian Colleges.

Minority-serving institutions

Students dressed in graduation regalia stand in rows.
Minority-serving institutions, including historically Black colleges and universities, have a mission to serve certain populations.
Andrew Caballero-Reynolds/AFP via Getty Images

Minority-serving institutions have a mission to serve certain student populations.

Minority-serving institutions include historically Black colleges and universities, or HBCUs, such as Morehouse College; Hispanic-serving institutions, or HSIs, such as Florida International University; Asian American, Native American and Pacific Islander-serving institutions, or AANAPISIs, such as North Seattle College; and tribal colleges and universities, or TCUs, such as Blackfeet Community College, which serve Native American students.

The federal government determines which colleges fit the criteria.

These are primarily two- and four-year colleges, but some grant graduate degrees.

The Conversation

Amy Li does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Trump’s battle with elite universities overlooks where most students actually go to college – https://theconversation.com/trumps-battle-with-elite-universities-overlooks-where-most-students-actually-go-to-college-254680

I’m a business professor who asked dozens of former students how they define success. Here are their lessons for today’s grads

Source: The Conversation – USA (2) – By Patrick Abouchalache, Lecturer in Strategy and Innovation, Boston University

As the Class of 2025 graduates into an uncertain and fast-changing working world, they face a crucial question: What does it mean to be successful?

Is it better to take a job that pays more, or one that’s more prestigious? Should you prioritize advancement, relationship building, community impact or even the opportunity to live somewhere new? Sorting through these questions can feel overwhelming.

I am a business school professor who spends a lot of time mentoring students and alumni in Generation Z – those born between 1997 and 2012. As part of this effort, I’ve surveyed about 300 former undergraduate students and spoken at length with about 50 of them.

Through these conversations, I’ve watched them wrestle with the classic conflicts of young adulthood – such as having to balance external rewards like money against internal motivations like wanting to be of service.

I recently revisited their stories and reflections, and I compiled the most enduring insights to offer to the next generation of graduates.

Here’s their collective advice to the Class of 2025:

1. Define what matters most to you

Success starts with self-reflection. It means setting aside society’s noise and defining your own values.

When people are driven by internal rewards like curiosity, purpose or pleasure in an activity itself – rather than outside benefits such as money – psychologists say they have “intrinsic motivation.”

Research shows that people driven by intrinsic motivation tend to display higher levels of performance, persistence and satisfaction. Harvard Business School professor Teresa Amabile’s componential theory further suggests that creativity flourishes when people’s skills align with their strongest intrinsic interests.

The alternative is to “get caught up in society’s expectations of success,” as one consulting alum put it. She described struggling to choose between a job offer at a Fortune 500 company and one at a lesser-known independent firm. In the end, she chose to go with the smaller business. It was, she stressed, “the right choice for me.” This is crucial advice: Make yourself proud, not others.

One related principle I share with students is the “Tell your story” rule. If a job doesn’t allow you to tell your story – in other words, if it doesn’t mirror your vision, values, talents and goals – keep looking for a new role.

2. Strive for balance, not burnout

A fulfilling life includes time for relationships, health and rest. While many young professionals feel endless pressure to hustle, the most fulfilled alumni I spoke with learned to take steps to protect their personal well-being.

For example, a banking alum told me that business once dominated his thoughts “24/7.” He continued, “I’m happier now that I make more time for a social life and paying attention to all my relationships – professional, personal, community, and let’s not forget myself.”

And remember that balance and motivations can change throughout your life. As one alum explained: “Your goals change and therefore your definition of success changes. I think some of the most successful people are always adapting what success means to them – chasing success even if they are already successful.”

3. Be kind, serve others and maximize your ‘happy circle’

“Some people believe to have a positive change in the world you must be a CEO or have a ton of money,” another alum told me. “But spreading happiness or joy can happen at any moment, has no cost, and the results are priceless.”

Many alumni told me that success isn’t just a matter of personal achievement – it’s about giving back to society. That could be through acts of kindness, creativity, innovation, or other ways of improving people’s lives. A retail alum shared advice from her father: “When your circle is happy, you are going to be happy,” she said. “It’s sort of an upward spiral.”

Your “happy circle” doesn’t need to consist of people you know. An alum who went into the pharmaceutical industry said his work’s true reward was measured in “tens of thousands if not millions of people” in better health thanks to his efforts.

In fact, your happy circle doesn’t even need to be exclusively human. An alum who works in ranching said he valued the well-being of animals – and their riders – more than money or praise.

4. Be a good long-term steward of your values

Success isn’t just about today – it’s what you stand for.

Several alumni spoke passionately about stewardship: the act of preserving and passing on values, relationships and traditions. This mindset extended beyond family to employees, customers and communities. As one alum who majored in economics put it, success is “leaving a mark on the world and creating a legacy that extends beyond one’s quest for monetary gain.”

One alum defined success as creating happiness and stability not just for herself, but for her loved ones. Another, who works in hospitality, said he had a duty to further his employees’ ambitions and help them grow and develop – creating a legacy that will outlast any title or paycheck.

In an analysis by the organizational consulting firm Korn Ferry, Gen Z employees were found to be more prone to burnout when their employers lacked clear values. These findings reinforce what my students already know: Alignment between your values and your work is key to success.

Final words for the Class of 2025

To the latest crop of grads, I offer this advice: Wherever life takes you next – a family business or corporate office, Wall Street or Silicon Valley, or somewhere you can’t even imagine now – remember that your career will be long and full of ups and downs.

You’ll make tough choices. You’ll face pressures. But if you stay grounded, invest in your well-being, celebrate your happy circle and honor your values, you’ll look back one day and see not just a job well done, but a life well lived.

Bon voyage!

The Conversation

Patrick Abouchalache does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. I’m a business professor who asked dozens of former students how they define success. Here are their lessons for today’s grads – https://theconversation.com/im-a-business-professor-who-asked-dozens-of-former-students-how-they-define-success-here-are-their-lessons-for-todays-grads-256189

When does a kid become an adult?

Source: The Conversation – USA (2) – By Jonathan B. Santo, Professor of Psychology, University of Nebraska Omaha

They might not be grown-ups yet. Klaus Vedfelt/DigitalVision via Getty Images

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to CuriousKidsUS@theconversation.com.


When does a kid become an adult? – Avery, age 8, Los Angeles


Not everyone grows up at the same pace, even though U.S. law holds that you reach adulthood when you turn 18. This is the age at which you are treated like an adult in terms of criminal responsibility. However, states differ on the “civil age of majority,” which means that you don’t necessarily get all the rights and privileges reserved for grown-ups at that point.

For example, U.S. citizens may vote or get a tattoo without their parents’ consent when they’re 18, but they can’t legally buy or consume alcohol until their 21st birthday. Young Americans are subject to extra restrictions and fees if they want to rent a car before they’re 25 – even if they got a driver’s license when they turned 16 and have been earning a living for years.

Even physical signs of maturity don’t provide an easy answer to this question. Puberty brings about physical changes associated with adulthood like facial hair or breast development. It also marks the onset of sexual maturity – being able to have children.

Those changes don’t happen at the same time for everyone.

For example, girls typically start going through puberty and beginning to look like adults at an earlier age than boys. Some people don’t look like grown-ups until they’re well into their 20s.

In my view, as a professor of developmental psychology, what really matters in terms of becoming an adult is how people feel and behave, and the responsibilities they handle.

18th Birthday cake with fruit and chocolate.
Even if you’ve developed a sophisticated palate by the time you turn 18, you still aren’t necessarily a full-fledged adult.
nedomacki/Getty Images

Age at milestones may vary

Because everybody is unique, there’s no standard timeline for growing up. Some people learn how to control their emotions, develop the judgment to make good decisions and manage to earn enough to support themselves by the age of 18.

Others take longer.

Coming of age also varies due to cultural differences. In some families, it’s expected that you’ll remain financially dependent on your parents until your mid-20s as you get a college education or job training.

Even within one family, your personality, experiences, career path and specific circumstances can influence how soon you’d be expected to shoulder adult responsibilities.

A young blonde woman stands while her photo is taken.
Drew Barrymore attends a movie premiere at the age of 15 – one year after a judge declared her to be an adult in the eyes of the law through emancipation.
Ron Galella, Ltd. via Getty Images

Some young people technically enter adulthood before they turn 18 through a process called “emancipation” – a legal status indicating that a young person is responsible for their own financial affairs and medical obligations.

Economic independence is hard to attain for young teens, however, because child labor is restricted and regulated in the U.S. by federal law, with states setting some of these rules. States also determine how old you have to be to get married. In most states, that’s 18 years old. But some states allow marriage at any age.

Differentiating between kids and adults

Understanding the differences between how children and adults think can help explain when a kid becomes an adult.

For example, children tend to think concretely and may struggle more than adults with abstract concepts like justice or hypothetical scenarios.

Kids and teens also have shorter attention spans than adults and are more easily distracted, whereas adults are generally better at filtering out distractions.

What’s more, children, especially little ones, tend to have more trouble controlling their emotions. They’re more prone to crying or screaming when they are frustrated or upset than adults.

One reason why being fully grown up by the time you turn 18 or even 21 might not be possible is because of our brains. The prefrontal cortex, which is a part of the brain that plays a crucial role in planning and weighing risks, doesn’t fully develop in most people before their 25th birthday.

Making choices that have lifelong consequences

The delay in the brain’s maturity can make it hard for young adults to fully consider the real-world consequences of their actions and choices. This mismatch may explain why adolescents and people in their early 20s often engage in risky or even reckless behavior – such as driving too fast, not wearing a seatbelt, using dangerous drugs, binge drinking or stealing things.

Despite the medical evidence about the late maturation of the brain, the law doesn’t provide any leeway for whether someone has truly matured if they’re accused of breaking the law. Once they’re 18 years old, Americans can be tried legally as adults for serious crimes, including murder.

These still-developing parts of the brain also help explain why children are more susceptible to peer pressure. For instance, adolescents are more prone to confess to crimes they didn’t commit under police interrogation, partly because they can’t properly weigh the long-term consequences of their decisions.

However, adolescents’ higher tolerance for risk has benefits, too. It can help explain why many young people are motivated to join protests over climate change and other causes.

Feeling like a real adult

In North America, some young people who by many standards are adults – in that they are over 20 years old, own a car and have a job – may not feel like they’re grown-ups regardless of what the law has to say about it. The psychologist Jeffrey Arnett coined the term “emerging adults” to describe Americans who are 21-25 years old but don’t yet feel like they’re grown-ups.

When someone becomes an adult, regardless of what the law says, really depends on the person.

There are 25-year-olds with full-time jobs and their own children who may still not feel like adults and still rely on their parents for a lot of things grown-ups typically handle. There are 17-year-olds who make all of their own doctor’s appointments, take care of their younger siblings or grandparents, and do all the grocery shopping, meal planning and laundry for their household. They probably see themselves as adults.

Growing up is about gaining experiences, making mistakes and learning from them, while also taking responsibility for your own actions. As there’s no single definition of adulthood, everyone has to decide for themselves whether or not they’ve turned into a grown-up yet.


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

The Conversation

Jonathan B. Santo does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. When does a kid become an adult? – https://theconversation.com/when-does-a-kid-become-an-adult-246287

Beyond the backlash: What evidence shows about the economic impact of DEI

Source: The Conversation – USA (2) – By Rodney Coates, Professor of Critical Race and Ethnic Studies, Miami University

DEI has a long history. Nora Carol Photography via Getty Images

Few issues in the U.S. today are as controversial as diversity, equity and inclusion – commonly referred to as DEI.

Although the term didn’t come into common usage until the 21st century, DEI is best understood as the latest stage in a long American project. Its egalitarian principles are seen in America’s founding documents, and its roots lie in landmark 20th-century efforts such as the 1964 Civil Rights Act and affirmative action policies, as well as movements for racial justice, gender equity, disability rights, veterans and immigrants.

These movements sought to expand who gets to participate in economic, educational and civic life. DEI programs, in many ways, are their legacy.

Critics argue that DEI is antidemocratic, that it fosters ideological conformity and that it leads to discriminatory initiatives, which they say disadvantage white people and undermine meritocracy. Those defending DEI argue just the opposite: that it encourages critical thinking and promotes democracy – and that attacks on DEI amount to a retreat from long-standing civil rights law.

Yet missing from much of the debate is a crucial question: What are the tangible costs and benefits of DEI? Who benefits, who doesn’t, and what are the broader effects on society and the economy?

As a sociologist, I believe any productive conversation about DEI should be rooted in evidence, not ideology. So let’s look at the research.

Who gains from DEI?

In the corporate world, DEI initiatives are intended to promote diversity, and research consistently shows that diversity is good for business. Companies with more diverse teams tend to perform better across several key metrics, including revenue, profitability and worker satisfaction.

Businesses with diverse workforces also have an edge in innovation, recruitment and competitiveness, research shows. The general trend holds for many types of diversity, including age, race and ethnicity, and gender.

A focus on diversity can also offer profit opportunities for businesses seeking new markets. Two-thirds of American consumers consider diversity when making their shopping choices, a 2021 survey found. So-called “inclusive consumers” tend to be female, younger and more ethnically and racially diverse. Ignoring their values can be costly: When Target backed away from its DEI efforts, the resulting backlash contributed to a sales decline.

But DEI goes beyond corporate policy. At its core, it’s about expanding access to opportunities for groups historically excluded from full participation in American life. From this broader perspective, many 20th-century reforms can be seen as part of the DEI arc.

Consider higher education. Many elite U.S. universities refused to admit women until well into the 1960s and 1970s. Columbia, the last Ivy League university to go co-ed, started admitting women in 1982. Since the advent of affirmative action, women haven’t just closed the gender gap in higher education – they outpace men in college completion across all racial groups. DEI policies have particularly benefited women, especially white women, by expanding workforce access.


Similarly, the push to desegregate American universities was followed by an explosion in the number of Black college students – a number that has increased by 125% since the 1970s, twice the national rate. With college gates open to more people than ever, overall enrollment at U.S. colleges has quadrupled since 1965. While there are many reasons for this, expanding opportunity no doubt plays a role. And a better-educated population has had significant implications for productivity and economic growth.

The 1965 Immigration Act also exemplifies DEI’s impact. It abolished racial and national quotas, enabling the immigration of more diverse populations, including from Asia, Africa, southern and eastern Europe and Latin America. Many of these immigrants were highly educated, and their presence has boosted U.S. productivity and innovation.

Ultimately, the U.S. economy is more profitable and productive as a result of immigrants.

What does DEI cost?

While DEI generates returns for many businesses and institutions, it does come with costs. In 2020, corporate America spent an estimated US$7.5 billion on DEI programs. And in 2023, the federal government spent more than $100 million on DEI, including $38.7 million by the Department of Health and Human Services and another $86.5 million by the Department of Defense.

The government will no doubt be spending less on DEI in 2025. One of President Donald Trump’s first acts in his second term was to sign an executive order banning DEI practices in federal agencies – one of several anti-DEI executive orders currently facing legal challenges. More than 30 states have also introduced or enacted bills to limit or entirely restrict DEI in recent years. Central to many of these policies is the belief that diversity lowers standards, replacing meritocracy with mediocrity.

But a large body of research disputes this claim. For example, a 2023 McKinsey & Company report found that companies with the highest levels of gender and ethnic diversity are 39% more likely to financially outperform those with the least diversity. Similarly, concerns that DEI in science and technology education leads to lowering standards aren’t backed up by scholarship. Instead, scholars are increasingly pointing out that disparities in performance are linked to built-in biases in courses themselves.

That said, legal concerns about DEI are rising. The Equal Employment Opportunity Commission and Department of Justice have recently warned employers that some DEI programs may violate Title VII of the Civil Rights Act of 1964. Anecdotal evidence suggests that reverse discrimination claims, particularly from white men, are increasing, and legal experts expect the Supreme Court to lower the burden of proof required of complainants in such cases.

The issue remains legally unsettled. But while the cases work their way through the courts, women and people of color will continue to shoulder much of the unpaid volunteer work that powers corporate DEI initiatives. This pattern raises important equity concerns within DEI itself.

What lies ahead for DEI?

People’s fears of DEI are partly rooted in demographic anxiety. Since the U.S. Census Bureau projected in 2008 that non-Hispanic white people would become a minority in the U.S. by the year 2042, nationwide news coverage has amplified white fears of displacement.

Research indicates many white men experience this change as a crisis of identity and masculinity, particularly amid economic shifts such as the decline of blue-collar work. This perception aligns with research showing that white Americans are more likely to believe DEI policies disadvantage white men than white women.

At the same time, in spite of DEI initiatives, women and people of color are the most likely to be underemployed and living in poverty regardless of how much education they attain. The gender wage gap remains stark: In 2023, women working full time earned median weekly pay of $1,005, compared with $1,202 for men – just 83.6% of what men earned. Over a 40-year career, that adds up to hundreds of thousands of dollars in lost earnings. For Black and Latina women, the disparities are even worse, with one source estimating lifetime losses at $976,800 and $1.2 million, respectively.

Racism, too, carries an economic toll. A 2020 analysis from Citi found that systemic racism has cost the U.S. economy $16 trillion since 2000. The same analysis found that addressing these disparities could have boosted Black wages by $2.7 trillion, added up to $113 billion in lifetime earnings through higher college enrollment, and generated $13 trillion in business revenue, creating 6.1 million jobs annually.

In a moment of backlash and uncertainty, I believe DEI remains a vital if imperfect tool in the American experiment of inclusion. Rather than abandon it, the challenge now, from my perspective, is how to refine it: grounding efforts not in slogans or fear, but in fairness and evidence.

The Conversation

Rodney Coates does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Beyond the backlash: What evidence shows about the economic impact of DEI – https://theconversation.com/beyond-the-backlash-what-evidence-shows-about-the-economic-impact-of-dei-252143