AI literacy: What it is, what it isn’t, who needs it and why it’s hard to define

Source: By Daniel S. Schiff, Assistant Professor of Political Science, Purdue University

AI literacy is a lot more than simply knowing how to prompt an AI chatbot. DNY59/E+ via Getty Images

It is “the policy of the United States to promote AI literacy and proficiency among Americans,” reads an executive order President Donald Trump issued on April 23, 2025. The executive order, titled Advancing Artificial Intelligence Education for American Youth, signals that advancing AI literacy is now an official national priority.

This raises a series of important questions: What exactly is AI literacy, who needs it, and how do you go about building it thoughtfully and responsibly?

The implications of AI literacy, or lack thereof, are far-reaching. They extend beyond national ambitions to remain “a global leader in this technological revolution” or even prepare an “AI-skilled workforce,” as the executive order states. Without basic literacy, citizens and consumers are not well equipped to understand the algorithmic platforms and decisions that affect so many domains of their lives: government services, privacy, lending, health care, news recommendations and more. And the lack of AI literacy risks ceding important aspects of society’s future to a handful of multinational companies.

How, then, can institutions help people understand and use – or resist – AI as individuals, workers, parents, innovators, job seekers, students, employers and citizens? We are a policy scientist and two educational researchers who study AI literacy, and we explore these issues in our research.

What AI literacy is and isn’t

At its foundation, AI literacy includes a mix of knowledge, skills and attitudes that are technical, social and ethical in nature. According to one prominent definition, AI literacy refers to “a set of competencies that enables individuals to critically evaluate AI technologies; communicate and collaborate effectively with AI; and use AI as a tool online, at home, and in the workplace.”

AI literacy is not simply programming or the mechanics of neural networks, and it is certainly not just prompt engineering – that is, the act of carefully writing prompts for chatbots. Vibe coding, or using AI to write software code, might be fun and important, but restricting the definition of literacy to the newest trend or the latest need of employers won’t cover the bases in the long term. And while a single master definition may not be needed, or even desirable, too much variation makes it tricky to decide on organizational, educational or policy strategies.

Who needs AI literacy? Everyone, including the employees and students using it, and the citizens grappling with its growing impacts. Every sector and sphere of society is now involved with AI, even if this isn’t always easy for people to see.

Exactly how much literacy everyone needs and how to get there is a much tougher question. Are a few quick HR training sessions enough, or do we need to embed AI across K-12 curricula and deliver university microcredentials and hands-on workshops? There is much that researchers don’t know, which is why it matters to measure both AI literacy itself and the effectiveness of different training approaches.

Ethics is an important aspect of AI literacy.

Measuring AI literacy

While there is a growing and bipartisan consensus that AI literacy matters, there’s much less consensus on how to actually understand people’s AI literacy levels. Researchers have focused on different aspects, such as technical or ethical skills, or on different populations – for example, business managers and students – or even on subdomains like generative AI.

A recent review study identified more than a dozen questionnaires designed to measure AI literacy, the vast majority of which rely on self-reported responses to questions and statements such as “I feel confident about using AI.” There’s also a lack of testing to see whether these questionnaires work well for people from different cultural backgrounds.

Moreover, the rise of generative AI has exposed gaps and challenges: Is it possible to create a stable way to measure AI literacy when AI is itself so dynamic?

In our research collaboration, we’ve tried to help address some of these problems. In particular, we’ve focused on creating objective knowledge assessments, such as multiple-choice surveys tested with thorough statistical analyses to ensure that they accurately measure AI literacy. We’ve so far tested a multiple-choice survey in the U.S., U.K. and Germany and found that it works consistently and fairly across these three countries.

There’s a lot more work to do to create reliable and feasible testing approaches. But going forward, just asking people to self-report their AI literacy probably isn’t enough to understand where different groups of people are and what supports they need.
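The gap between self-reported confidence and objective knowledge can be made concrete with a small sketch. The items, Likert scale and answer key below are invented for illustration only; they are not drawn from the authors' instrument.

```python
# Hypothetical contrast between self-reported and objective AI literacy measures.
# Items and answer key are invented for illustration only.

def self_report_score(ratings, max_rating=5):
    """Average of Likert ratings (1 to max_rating), rescaled to 0-1."""
    return sum(ratings) / (len(ratings) * max_rating)

def objective_score(answers, answer_key):
    """Fraction of multiple-choice items answered correctly."""
    correct = sum(1 for a, k in zip(answers, answer_key) if a == k)
    return correct / len(answer_key)

# One respondent: feels confident (high self-report) but misses half the items.
confidence_ratings = [5, 4, 5]          # e.g., "I feel confident using AI."
quiz_answers = ["b", "a", "d", "c"]
answer_key   = ["b", "c", "a", "c"]

print(self_report_score(confidence_ratings))       # 14/15 ≈ 0.93
print(objective_score(quiz_answers, answer_key))   # 2/4 = 0.5
```

The mismatch between the two numbers is exactly what self-report questionnaires alone cannot detect.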

Approaches to building AI literacy

Governments, universities and industry are trying to advance AI literacy.

Finland launched the Elements of AI series in 2018 with the hope of educating its general public on AI. Estonia’s AI Leap initiative partners with Anthropic and OpenAI to provide access to AI tools for tens of thousands of students and thousands of teachers. And China is now requiring at least eight hours of AI education annually as early as elementary school, which goes a step beyond the new U.S. executive order. On the university level, Purdue University and the University of Pennsylvania have launched new master’s in AI programs, targeting future AI leaders.

These initiatives nonetheless face an unclear and evolving understanding of AI literacy. They also face challenges in measuring effectiveness and minimal knowledge about which teaching approaches actually work. And there are long-standing issues with respect to equity: for example, reaching schools, communities, segments of the population and businesses that are stretched or under-resourced.

Next moves on AI literacy

Based on our research, experience as educators and collaboration with policymakers and technology companies, we think a few steps might be prudent.

Building AI literacy starts with recognizing it’s not just about tech: People also need to grasp the social and ethical sides of the technology. To see whether we’re getting there, we researchers and educators should use clear, reliable tests that track progress for different age groups and communities. Universities and companies can try out new teaching ideas first, then share what works through an independent hub. Educators, meanwhile, need proper training and resources, not just additional curricula, to bring AI into the classroom. And because opportunity isn’t spread evenly, partnerships that reach under-resourced schools and neighborhoods are essential so everyone can benefit.

Critically, achieving widespread AI literacy may be even harder than building digital and media literacy, so getting there will require serious investment in, not cuts to, education and research.

There is widespread consensus that AI literacy is important, whether to boost AI trust and adoption or to empower citizens to challenge AI or shape its future. As with AI itself, we believe it’s important to approach AI literacy carefully, avoiding hype or an overly technical focus. The right approach can prepare students to become “active and responsible participants in the workforce of the future” and empower Americans to “thrive in an increasingly digital society,” as the AI literacy executive order calls for.


The Conversation

Funding from Google Research helped to support part of the authors’ research on AI literacy.

Funding from the German Federal Ministry of Education and Research under the funding code 16DHBKI051 helped to support part of the authors’ research on AI literacy.

Arne Bewersdorff does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. AI literacy: What it is, what it isn’t, who needs it and why it’s hard to define – https://theconversation.com/ai-literacy-what-it-is-what-it-isnt-who-needs-it-and-why-its-hard-to-define-256061

The hidden bias in college admissions tests: How standardized exams can favor privilege over potential

Source: By Zarrina Talan Azizova, Associate Professor of Education, Health and Behavior, University of North Dakota

At first glance, calls from members of Congress to restore academic merit in college admissions might sound like a neutral policy.

In our view, these campaigns often cherry-pick evidence and mask a coordinated effort that targets access and diversity in American colleges.

As scholars who study access to higher education, we have found that when these efforts are paired with pressure to reinstate standardized tests, they amount to a rollback of inclusive practices.

A Department of Education letter sent to congressional offices on Feb. 14, 2025, stated that it is “unlawful for an educational institution to eliminate standardized testing to achieve a desired racial balance or to increase racial diversity.” The letter also claimed that the most widely used admissions tests, the SAT and ACT, are objective measures of merit.

In our recent peer-reviewed article, we analyzed more than 70 empirical studies about the SAT’s and ACT’s roles in college admissions. Our work found several flaws in how these exams function, especially for historically underserved students.

Measuring college readiness

Two male students sit in a campus library reviewing notes.
Supporters of admissions tests contend that they are objective tools for measuring whether students are ready for college-level coursework.
The Good Brigade/Digital Vision via Getty Images

Several elite universities – including Yale, Dartmouth and the Massachusetts Institute of Technology – have reinstated SAT or ACT requirements, reversing test-optional policies that institutions expanded during the COVID-19 pandemic.

These changes have reignited debates about how well these tests measure students’ academic preparedness and how colleges should weigh them in admissions decisions.

During a May 21, 2025, hearing of the U.S. House Subcommittee on Higher Education and Workforce Development, some witnesses argued that using test scores allows colleges to admit students based on merit. Others maintained that test scores can function as barriers to higher education.

Our research shows that while these tests are statistically reliable – that is, they produce consistent results for students across subjects and during multiple attempts under similar conditions – they are not as valid as some argue.

High school grade-point averages are typically better predictors of students’ success in college than either test.

In addition, the tests are not equitable or similarly predictive for all students, especially given gender, race and socioeconomic demographics.

That is because they systematically favor those with more access to high-quality schooling, stable socioeconomic conditions and opportunities to engage with test prep coaches and courses. That test prep can cost thousands of dollars.

In short, both tests tend to reflect privilege more than potential.

For example, students from higher-income households routinely outperform their peers on the ACT and SAT.

This isn’t surprising, considering wealthier families can afford test prep services, private tutoring and test retakes. These advantages translate into higher scores and open doors to selective colleges and scholarship opportunities.

Meanwhile, students from low-income families often face challenges – such as less experienced instructors and less access to high-level science, math and advanced placement courses – that test scores do not factor in.

Reflecting deep inequities

An overhead photo of students in a study group sitting around a small glass table.
In the U.S., high school GPA can be a better predictor than standardized tests of college success.
Clerkenwell/Vetta via Getty Images

In our published review, we found that these disparities aren’t incidental – they’re systemic.

Our review revealed long-standing evidence of bias in test design and differences in average scores along lines of race, gender and language background.

These outcomes don’t just reflect academic differences; they reflect inequities that shape how students prepare for and perform on these tests.

We also found that high school GPA outperforms standardized tests in predicting college success. GPA captures years of classroom performance, effort and teacher feedback. It reflects how students navigate real-world challenges, not just how they perform on a single timed exam.

For many students, particularly those from historically marginalized backgrounds, grades can offer a better indication of how prepared they are for college-level work.

This issue matters because admissions decisions aren’t just technical evaluations – they are value statements. Choosing to center test scores in admissions rewards certain kinds of knowledge, experiences and preparation.

The American Council on Education defines equity as opportunities for success. It means building educational environments that recognize diverse forms of potential and equip all learners to thrive.

It’s worth noting that research on testing often focuses on elite institutions, where standardized test scores are more likely to be used as high-stakes screening tools. Our systematic review found that, even in elite schools, the tests’ ability to accurately predict college academic performance is often limited (moderate in statistical terms).

But most college students attend state universities, public regional universities, minority-serving institutions, or colleges that accept most applicants. Our study found that at these institutions, standardized test scores are even less likely to predict how students will do.

This may be because state universities and public regional universities are more likely to serve highly diverse student populations, including older, part-time and first-generation students and those who are balancing work and family responsibilities.

Where does higher ed go from here?

An elevated view of college students walking up stairs.
Prioritizing standardized tests in college admissions could close the doors of opportunity for some capable students.
David Schaffer/istock via Getty Images Plus

With the debate over the role of standardized tests in the admissions process, higher education stands at a crossroads: Will colleges yield to political pressure and narrow definitions of merit and ignore equity? Or will institutions reaffirm their mission by embracing broader, fairer tools for recognizing talent and supporting student success?

The answer depends on what values are prioritized.

Our research and that of others make it clear that standardized tests should not be the gatekeepers of opportunity.

If universities define merit by test scores alone, they risk closing the doors of opportunity to capable students.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. The hidden bias in college admissions tests: How standardized exams can favor privilege over potential – https://theconversation.com/the-hidden-bias-in-college-admissions-tests-how-standardized-exams-can-favor-privilege-over-potential-256967

Companies haven’t stopped hiring, but they’re more cautious, according to the 2025 College Hiring Outlook Report

Source: By Murugan Anandarajan, Professor of Decision Sciences and Management Information Systems, Drexel University

Recent college grads face a tough job market in 2025, but employers are still hiring. sturti/E+ via Getty Images

Every year, I tell my students in my business analytics class the same thing: “Don’t just apply for a job. Audition for it.”

This advice seems particularly relevant this year. In today’s turbulent economy, companies are still hiring, but they’re doing it a bit more carefully. More places are offering candidates short-term work experiences like internships and co-op programs in order to evaluate them before making them full-time offers.

This is just one of the findings of the 2025 College Hiring Outlook Report. This annual report tracks trends in the job market and offers valuable insights for both job seekers and employers. It is based on a national survey conducted in September 2024, with responses from 1,322 employers spanning all major industries and company sizes, from small firms to large enterprises. The survey looks at employer perspectives on entry-level hiring trends, skills demand and talent development strategies.

I am a professor of information systems at Drexel University’s LeBow College of Business in Philadelphia, and I co-authored this report along with a team of colleagues at the Center for Career Readiness.

Here’s what we found:

Employers are rethinking talent pipelines

Only 21% of the 1,322 employers we surveyed rated the current college hiring market as “excellent” or “very good,” which is a dramatic drop from 61% in 2023. This indicates that companies are becoming increasingly cautious about how they recruit and select new talent.

While confidence in full-time hiring has declined, employers are not stepping away from hiring altogether. Instead, they’re shifting to paid and unpaid internships, co-ops and contract-to-hire roles as a way to identify talent and “de-risk” full-time hiring.

Employers we surveyed described internships as a cost-effective talent pipeline, and 70% told us they plan to maintain or increase their co-op and intern hiring in 2025. At a time when many companies are tightening their belts, hiring someone who’s already proved themselves saves on onboarding, reduces turnover and minimizes potentially costly mishires.

For job seekers, this makes every internship or short-term role more than a foot in the door. It’s an extended audition. Even with the general market looking unstable, interest in co-op and internship programs appears steady, especially among recent graduates facing fewer full-time opportunities.

These programs aren’t just about trying out a job. They let employers see if a candidate shows initiative, good judgment and the ability to work well on a team, which we found are traits employers value even more than technical skills.

What employers want

We found that employers increasingly prioritize self-management skills like adaptability, ethical reasoning and communication over technical skills such as digital literacy and cybersecurity. Employers are paying attention to how candidates behave during internships, how they take feedback, and whether they bring the mindset needed to grow with the company.

This reflects what I have observed in classrooms and in conversations with hiring managers: Credentials matter, but what truly sets candidates apart is how they present themselves and what they contribute to a company.

Based on co-op and internship data we’ve collected at Drexel, however, many students continue to believe that technical proficiency is the key to getting a job.

In my opinion, this disconnect reveals a critical gap in expectations: While students focus on hard skills to differentiate themselves, employers are looking for the human skills that indicate long-term potential, resilience and professionalism. This is especially true in the face of economic uncertainty and the ambiguous, fast-changing nature of today’s workplace.

Technology is changing how hiring happens

Employers also told us that artificial intelligence is now central to how both applicants and employers navigate the hiring process.

Some companies are increasingly using AI-powered platforms to transform their hiring processes. For example, Children’s Hospital of Philadelphia uses platforms like HireVue to conduct asynchronous video interviews. HR-focused firms like Phenom and JJ Staffing Services also leverage technologies such as AI-based resume ranking, automated interview scheduling and one-way video assessments.

Not only do these tools speed up the hiring process, but they also reshape how employers and candidates interact. In our survey, large employers said they are increasingly relying on AI tools like resume screeners and one-way video interviews to manage large numbers of job applicants. As a result, the candidate’s presence, clarity in communication and authenticity are being evaluated even before a human recruiter becomes involved.

At the same time, job seekers are using generative AI tools to write cover letters, practice interviews or reformat resumes. These tools can help with preparation, but overreliance on them can backfire. Employers want authenticity, and many employers we surveyed mentioned they notice when applications seem overly robotic.

In my experience as a professor, the key is teaching students to use AI to enhance their effort and not replace it. I encourage them to leverage AI tools but always emphasize that the final output and the impression it makes should reflect their own thinking and professionalism. The bottom line is that hiring is still a human decision, and the personal impression you make matters.

This isn’t just about new grads

While our research focuses on early-career hiring, these findings apply to other audiences as well, such as career changers, returning professionals and even mid-career workers. These workers are increasingly being evaluated on their adaptability, behavior and collaborative ability – not just their experience.

Many companies now offer project-based assignments and trial roles that let them evaluate performance before making a permanent hire.

At the same time, employers are investing in internal reskilling and upskilling programs. Reskilling refers to training workers for entirely new roles, often in response to job changes or automation, while upskilling means helping employees deepen their current skills to stay effective and advance in their existing roles. Our report indicates that approximately 88% of large companies now offer structured upskilling and reskilling programs. For job seekers and workers alike, staying competitive means taking the initiative and demonstrating a commitment to learning and growth.

Show up early, and show up well

So what can students, or anyone entering or reentering the workforce, do to prepare?

  • Start early. Don’t wait until senior year. First- and second-year internships are growing in importance.

  • Sharpen your soft skills. Communication, time management, problem-solving and ethical behavior are top priorities for employers.

  • Understand where work is happening. Over 50% of entry-level jobs are fully in-person. Only 4% are fully remote. Show up ready to engage.

  • Use AI strategically. It’s a useful tool for research and practice, not a shortcut to connection or clarity.

  • Stay curious. Most large employers now offer reskilling or upskilling opportunities – and they expect employees to take initiative.

One of the clearest takeaways from this year’s report is that hiring is no longer a one-time decision. It’s a performance process that often begins before an interview is even scheduled.

Whether you’re still in school, transitioning in your career or returning to the workforce after a break, the same principle applies: Every opportunity is an audition. Treat it like one.

The Conversation

Murugan Anandarajan does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Companies haven’t stopped hiring, but they’re more cautious, according to the 2025 College Hiring Outlook Report – https://theconversation.com/companies-havent-stopped-hiring-but-theyre-more-cautious-according-to-the-2025-college-hiring-outlook-report-257870

Trump administration’s conflicting messages on Chinese student visas reflect complex US-China relations

Source: By Meredith Oyen, Associate Professor of History and Asian Studies, University of Maryland, Baltimore County

The U.S. announced plans to scrutinize and revoke student visas for students with ties to the Chinese Communist Party or whose studies are in critical fields, but appears to have reconsidered. The decision and apparent about-face could have a wide-ranging impact on both nations. LAW Ho Ming/Getty Images

President Donald Trump appears to have walked back plans for the U.S. State Department to scrutinize and revoke visas for Chinese students studying in the country.

On June 11, 2025, Trump posted on his social media platform TruthSocial that visas for Chinese students would continue and that they are welcome in the United States, as their presence “has always been good with me!”

The announcement came weeks after Secretary of State Marco Rubio announced that his department would begin scrutinizing and revoking student visas for Chinese nationals with ties to the Chinese Communist Party, or whose studies are in critical fields.

The contradictory moves have led to confusion among Chinese students attending college or considering studying in the United States.

Over time, Chinese nationals have faced barriers to studying in the U.S. As a scholar who studies relations between the two nations, I argue that efforts to ban Chinese students in the United States are not unprecedented, and historically they have come with consequences.

Student visas under fire

Two students sit side by side studying in a library.
The Trump administration laid out the terms for revoking or denying student visas to Chinese nationals but then backtracked.
STAP/Getty Images

Since the late 1970s, millions of Chinese students have been granted visas to study at American universities. That total includes approximately 277,000 who studied in the United States in the 2023-2024 academic year.

It is difficult to determine how many of these students would have been affected by a ban on visas for individuals with Chinese Communist Party affiliations or in critical fields.

Approximately 40% of all new members of the Chinese Communist Party each year are drawn from China’s student population. And many universities in China have party connections or charters that emphasize party loyalty.

The “critical fields” at risk were not defined. A majority of Chinese students in the U.S. are enrolled in math, technology, science and engineering fields.

A long history

A student holding chalks writes Mandarin text on a blackboard.
Since the late 1970s, the number of Chinese students attending college in the U.S. has increased dramatically.
Kenishiroite/Getty Images

Yung Wing became the first Chinese student to graduate from a U.S. university in 1852.

Since then, millions of Chinese students have come to the United States to study, supported by programs such as the “Chinese Educational Mission,” Boxer Indemnity Fund scholarships and the Fulbright Program.

The Institute for International Education in New York estimated the economic impact of Chinese students in the U.S. at over US$14 billion a year. Chinese students tend to pay full tuition to their universities. At the graduate level, they perform vital roles in labs and classrooms. Just under half of all Chinese students attending college in the U.S. are graduate students.

However, there is a long history of portraying Chinese migrants as invaders, spies or risks to national security.

After the outbreak of the Korean War in 1950, the U.S. Department of Justice began to prevent Chinese scholars and students in STEM fields – science, technology, engineering and math – from returning to China by stopping them at U.S. ports of entry and exit. They could be pulled aside when trying to board a flight or ship and their tickets canceled.

In one infamous case, Chinese rocket scientist Qian Xuesen was arrested, harassed, ordered deported and prevented from leaving the country for five years, from 1950 to 1955. In 1955, the United States and China began ambassadorial-level talks to negotiate repatriations from either country. After his experience, Qian became a much-lauded supporter of the Communist government and played an important role in the development of Chinese intercontinental missile technology.

During the 1950s, the U.S. Department of Justice raided Chinatown organizations looking for Chinese migrants who arrived under false names during the Chinese Exclusion Era, a period from the 1880s to 1940s when the U.S. government placed tight restrictions on Chinese immigration into the country. A primary justification for the tactics was fear that the Chinese in the U.S. would spy for their home country.

Between 1949 and 1979, the U.S. and China did not have normal diplomatic relations. The two nations recognized each other and exchanged ambassadors starting in January 1979. In the more than four decades since, the number of Chinese students in the U.S. has increased dramatically.

Anti-Chinese discrimination

The idea of an outright ban on Chinese student visas has raised concerns about increased targeting of Chinese in the U.S. for harassment.

In 1999, Taiwanese-American scientist Wen Ho Lee was arrested on suspicion of using his position at Los Alamos National Laboratory in New Mexico to spy for China. Lee remained imprisoned in solitary confinement for 278 days before he was released without a conviction.

In 2018, during the first Trump administration, the Department of Justice launched its China Initiative. In its effort to weed out industrial, technological and corporate espionage, the initiative targeted many ethnic Chinese researchers and had a chilling effect on continued exchanges, but it secured no convictions for wrongdoing.

Trump again expressed concerns last year that undocumented migrants from China might be coming to the United States to spy or “build an army.”

The repeated search for spies among Chinese migrants and residents in the U.S. has created an atmosphere of fear for Chinese American communities.

Broader foreign policy context

Two puzzle pieces — one representing the United States flag and the other the Chinese flag — stand separated against a neutral background.
An atmosphere of suspicion has altered the climate for Chinese international students.
J Studios/Getty Images

The U.S. plan to revoke visas for Chinese students and the Chinese response are unfolding amid contentious debates over trade.

Chinese Ministry of Foreign Affairs spokesperson Lin Jian accused the U.S. of violating an agreement on tariff reduction the two sides discussed in Geneva in May, citing the visa issues as one example.

Trump has also complained that the Chinese violated agreements between the countries, and some reports suggest that the announcement on student visas was a negotiating tactic to change the Chinese stance on the export of rare earth minerals.

When Trump announced his trade deal with China on June 11, he added a statement welcoming Chinese students.

However, past practice shows that the atmosphere of uncertainty and suspicion may have already damaged the climate for Chinese international students, and at least some degree of increased scrutiny of student visas will likely continue regardless.

The Conversation

Meredith Oyen does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Trump administration’s conflicting messages on Chinese student visas reflect complex US-China relations – https://theconversation.com/trump-administrations-conflicting-messages-on-chinese-student-visas-reflect-complex-us-china-relations-258351

Diversifying the special education teacher workforce could benefit US schools

Source: – By Elizabeth Bettini, Assistant Professor of Special Education, Boston University

The demographics of the special education teacher workforce have remained static, but the student population these educators serve is becoming more diverse. Courtney Hale/E+ via Getty Images

Teachers of color positively impact all students, including students of color with disabilities. Yet, the special education teacher workforce is overwhelmingly white.

In our recent research, we found that special education teacher demographics are not keeping pace with changes in the student population.

In 2012, about 80% of U.S. public school teachers were white, including about 80% of special education teachers, while less than 20% were teachers of color. By contrast, in the same year, students of color constituted 47% of those diagnosed with disabilities.

In our recent study, we examined whether these numbers have changed. Analyzing multiple national datasets on the teacher workforce, we found the proportion of special education teachers of color has been static, even as the student population is rapidly becoming more diverse.

So, the special education teacher workforce is actually becoming less representative of the student population over time. Specifically, in 2012, 16.5% of special education teachers were people of color, compared with 17.1% in 2021. In that same span, the share of students with disabilities who are students of color rose from 47.3% in 2012 to 53.9% in 2021.

In fact, for the special education teacher workforce to become representative of the student population, U.S. schools would need to triple the number of special education teachers of color.

As scholars who study teacher recruitment and retention and teacher working conditions, we are concerned that this disparity will affect the quality of education students receive.

Why does a diverse teacher workforce matter?

Without more support from the government, the U.S. teacher workforce is likely to remain predominantly white.
gradyreese/iStock via Getty Images

For children of color, the research is clear: Teachers of color are, on average, more effective than white teachers in providing positive educational experiences and outcomes for students of color, including students of color with disabilities.

One study found that low-income Black male students who had one Black teacher in third, fourth or fifth grade were 39% less likely to drop out of high school and 29% more likely to enroll in college.

Moreover, teachers of color are just as effective as white teachers – and sometimes more effective – in teaching white students.

Providing pathways

The U.S. has institutions dedicated to attracting and retaining educators of color: Programs at historically Black colleges and universities, Hispanic-serving institutions and other minority-serving institutions prepare a substantial number of new teachers of color annually.

Further, many local initiatives support educators of color and attract teachers who might not otherwise have opportunities to join the profession.

These include Grow Your Own programs that recruit effective teachers of color from local communities, teacher residency programs that help schools retain teachers of color, and scholarships and loan forgiveness programs that support all teachers, including teachers of color.

However, the U.S. educator workforce faces broad challenges with declining interest in the teaching profession and declining enrollment in teacher preparation programs. In this context, our findings indicate that without significant investments, the teacher workforce is likely to remain predominantly white – at significant cost to students with disabilities.

Anti-DEI movement cuts funding

The Trump administration has canceled teacher preparation grants that recruit teachers of color and has taken other actions that could lead to a less diverse and skilled educator workforce.
Klaus Vedfelt/Getty Images

While there have been long-standing challenges, recent steps taken by the Trump administration could limit efforts to boost teacher diversity.

In its push to end diversity, equity and inclusion programs, the administration has cut grant funding for programs designed to develop a diverse educator workforce.

The administration has also cut millions of dollars dedicated to training teachers to work in underfunded, high-poverty schools and has threatened additional funding cuts to universities engaging in equity-based work.

These federal actions make the teacher workforce less adept at addressing the substantial challenges facing U.S. schools, such as declining interest in the teaching profession and persistent racial disparities in student outcomes.

Given the strong evidence of the benefits of teachers of color and the national trends that our research uncovered, federal and state investments should prioritize supporting prospective teachers of color.

The Conversation

Elizabeth Bettini’s research has been funded by the US Department of Education’s National Center for Special Education Research within the Institute of Education Sciences, the US Department of Education’s Office of Special Education Programs, and the Spencer Foundation. She is affiliated with the Council for Exceptional Children’s Division for Research and Teacher Education Division, for which she edits the journal Teacher Education and Special Education.

LaRon A. Scott has received funding from the U.S. Department of Education Office of Special Education Programs. He is affiliated with the Council for Exceptional Children’s Teacher Education Division and the American Association for Individuals with Intellectual and Developmental Disabilities.

Tuan D. Nguyen receives funding from the National Science Foundation to do work around STEM teachers and computer science education.

ref. Diversifying the special education teacher workforce could benefit US schools – https://theconversation.com/diversifying-the-special-education-teacher-workforce-could-benefit-us-schools-254916

Blocking exports and raising tariffs is a bad defense against industrial cyber espionage, study shows

Source: – By William Akoto, Assistant Professor of Global Security, American University

Cutting off China’s access to advanced U.S. chips is likely to motivate Chinese cyber espionage. kritsapong jieantaratip/iStock via Getty Images

The United States is trying to decouple its economy from rivals like China. Efforts toward this include policymakers raising tariffs on Chinese goods, blocking exports of advanced technology and offering subsidies to boost American manufacturing. The goal is to reduce reliance on China for critical products in hopes that this will also protect U.S. intellectual property from theft.

The idea that decoupling will help stem state-sponsored cyber-economic espionage has become a key justification for these measures. For instance, then-U.S. Trade Representative Katherine Tai framed the continuation of China-specific tariffs as serving the “statutory goal to stop [China’s] harmful … cyber intrusions and cyber theft.” Early tariff rounds during the first Trump administration were likewise framed as forcing Beijing to confront “deeply entrenched” theft of U.S. intellectual property.

This push to “onshore” key industries is driven by very real concerns. By some estimates, theft of U.S. trade secrets – often through hacking – costs the American economy hundreds of billions of dollars per year. In that light, decoupling is a defensive economic shield – a way to keep vital technology out of an adversary’s reach.

But will decoupling and cutting trade ties truly make America’s innovations safer from prying eyes? I’m a political scientist who studies state-sponsored cyber espionage, and my research suggests that the answer is a definitive no. Indeed, it might actually have the opposite effect.

To understand why, it helps to look at what really drives state-sponsored hacking.

Rivalry, not reliance

Intuitively, you might think a country is most tempted to steal secrets from a nation it depends on. For example, if Country A must import jet engines or microchips from Country B, Country A might try to hack Country B’s companies to copy that technology and become self-sufficient. This is the industrial dependence theory of cyber theft.

There is some truth to this motive. If your economy needs what another country produces, stealing that know-how can boost your own industries and reduce reliance. However, in a recent study, I show that a more powerful predictor of cyber espionage is industrial similarity. Countries with overlapping advanced industries such as aerospace, electronics or pharmaceuticals are the ones most likely to target each other with cyberattacks.

Why would having similar industries spur more spying? The reason is competition. If two nations both specialize in cutting-edge sectors, each has a lot to gain by stealing the other’s innovations.

If you’re a tech powerhouse, you have valuable secrets worth stealing, and you have the capability and motivation to steal others’ secrets. In essence, simply trading with a rival isn’t the core issue. Rather, it’s the underlying technological rivalry that fuels espionage.

For example, a cyberattack in 2012 targeted SolarWorld, a U.S. solar panel manufacturer, and the perpetrators stole the company’s trade secrets. Chinese solar companies then developed competing products based on the stolen designs, costing SolarWorld millions in lost revenue. This is a classic example of industrial similarity at work. China was building its own solar industry, so it hacked a U.S. rival to leapfrog in technology.

China has made major investments in its cyber-espionage capabilities.

Boosting trade barriers can fan the flames

Crucially, cutting trade ties doesn’t remove this rivalry. If anything, decoupling might intensify it. When the U.S. and China exchange tariff blows or cut off tech transfers, it doesn’t make China give up – it likely pushes Chinese intelligence agencies to work even harder to steal what they can’t buy.

This dynamic isn’t unique to China. Any country that suddenly loses access to an important technology may turn to espionage as Plan B.

History provides examples. When South Africa was isolated by sanctions in the 1980s, it covertly obtained nuclear weapons technology. Similarly, when Israel faced arms embargoes in the 1960s, it engaged in clandestine efforts to get military technology. Isolation can breed desperation, and hacking is a low-cost, high-reward tool for the desperate.

If decoupling won’t end cyber espionage, what will?

There’s no easy fix for state-sponsored hacking as long as countries remain locked in high-tech competition. However, there are steps that can mitigate the damage and perhaps dial down the frequency of these attacks.

One is investing in cyber defense. Just as a homeowner adds locks and alarms after a burglary, companies and governments should continually strengthen their cyber defenses. Assuming that espionage attempts are likely to happen is key. Advanced network monitoring, employee training against phishing, and robust encryption can make it much harder for hackers to succeed, even if they keep trying.

Another is building resilience and redundancy. If you know that some secrets might get stolen, plan for it. Businesses can shorten product development cycles and innovate faster so that even if a rival copies today’s tech, you’re already moving on to the next generation. Staying ahead of thieves is a form of defense, too.

Ultimately, rather than viewing tariffs and export bans as silver bullets against espionage, U.S. leaders and industry might be safer focusing on resilience and stress-testing cybersecurity firms. Make it harder for adversaries to steal secrets, and less rewarding even if they do.

The Conversation

William Akoto does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Blocking exports and raising tariffs is a bad defense against industrial cyber espionage, study shows – https://theconversation.com/blocking-exports-and-raising-tariffs-is-a-bad-defense-against-industrial-cyber-espionage-study-shows-258243

What is reconciliation − the legislative shortcut Republicans are using to push through their ‘big, beautiful bill’?

Source: – By Linda J. Bilmes, Daniel Patrick Moynihan Senior Lecturer in Public Policy and Public Finance, Harvard Kennedy School

Senate Majority Leader John Thune speaks with reporters about the reconciliation process to advance President Donald Trump’s spending and tax bill on June 3, 2025. AP Photo/J. Scott Applewhite

The word “reconciliation” sounds benign, even harmonious.

But in Washington, D.C., reconciliation refers to a potent legislative shortcut that allows the party in power to avoid opposition and enact sweeping changes to taxes and spending with a simple majority vote. Democrats used the process to pass the Inflation Reduction Act in 2022. Reconciliation helped Republicans pass large tax cuts in 2017.

Reconciliation is also at the heart of the current budget debate, as Senate Republicans rush to advance their version of the “One Big Beautiful Bill Act,” also known by its acronym OBBBA, which passed the House in May 2025.

I served as assistant secretary of Commerce for management and budget during the Clinton administration, when my colleagues and I helped forge bipartisan legislation that balanced the federal budget and produced surpluses over four years, from 1998 to 2001. We were even able to pay off some debt.

But since 2001, the country’s fiscal situation has deteriorated significantly. And the reconciliation process has strayed from its original purpose as a mechanism to promote sound fiscal policy. Instead, it is now used to pass partisan legislation, often without regard to its economic impact on future generations of Americans.

Reconciliation 101

The reconciliation process was created by the Congressional Budget Act of 1974, which was overwhelmingly supported by both parties. It was designed to align policy goals with budget targets to help rein in deficits.

The rules specify that a bill using the reconciliation process must pertain directly to budgetary or fiscal matters, cannot change Social Security, Medicare or the budget process itself, or deliberately extend deficits beyond a 10-year window. As part of the process, the parliamentarian goes through each element of the bill and determines whether it meets the requirements, removing any that don’t.

This caused the One Big Beautiful Bill Act to hit a snag in the Senate on June 25, 2025, after the parliamentarian ruled several major parts of it couldn’t be included as written, such as a crackdown on states’ tactics for getting more Medicaid funds and a limit on student debt repayment options.

In the Senate, reconciliation has special procedural advantages. Debate is limited to 20 hours. Conveniently for the party in power, the final bill can pass with a simple majority of 51 votes. This avoids the usual 60-vote threshold needed to overcome a filibuster.

Over its 50-year history, 23 reconciliation bills have become law.

Reconciliation on rise as budget process breaks down

Over time, reconciliation has become the dominant method for enacting major tax and spending legislation, as the regular congressional budget process has broken down.

Since 1974, there have been multiple government shutdowns, near-shutdowns and short-term, stopgap “continuing resolutions” instead of annual budgets, accompanied by rising deficits and national debt.

With few other tools at its disposal, Congress has used reconciliation to push through many pieces of major economic legislation, including the 2001 and 2003 tax cuts under President George W. Bush, the 2017 tax cuts during President Donald Trump’s first term, and the American Rescue Plan in 2021 and the Inflation Reduction Act in 2022 during the Biden administration.

However, reconciliation has significant flaws. Because debate is limited, senators often vote on bills over 1,000 pages long with little time to review the details. And once tax cuts are enacted under reconciliation, it is devilishly hard to get rid of them.

Given the compressed timelines and lack of transparency inherent in such huge, messy spending bills, it is fairly easy for lawmakers to slip in earmarks, tax loopholes and other extraneous items that don’t get removed by the parliamentarian.

House Minority Leader Hakeem Jeffries argues Republicans’ spending and tax bill will ‘explode the deficit.’
AP Photo/J. Scott Applewhite

What’s in the bill?

At the heart of the One Big Beautiful Bill Act, passed by the House, is an extension of President Trump’s tax cuts from his first term, which would otherwise expire at the end of 2025 under the procedural rules for reconciliation.

But it also includes multiple new tax cuts – such as an end to taxes on overtime and tips and lower estate taxes – introduces new Medicaid work requirements and repeals various energy credits. In line with the Trump administration’s policies, the bill slashes federal funding for education, Medicaid, public housing, environmental programs, scientific research and some national park and public land protection programs. It also boosts defense spending.

The bill would sharply worsen the nation’s fiscal outlook, according to analyses by the nonpartisan Congressional Budget Office and other organizations.

Currently, the national debt exceeds US$36 trillion, according to the U.S. Treasury, and net interest payments account for some 16% of federal revenue, based on the Congressional Budget Office’s projections for 2025.

In its analysis, the Congressional Budget Office – which was also created by the 1974 act – said the House-passed version would increase deficits by more than $3.1 trillion over the next decade. The overwhelming share of this cost comes from the permanent extension of individual tax cuts initially enacted in 2017.

According to the Congressional Budget Office’s analysis, by 2035 households earning at least $1 million would receive an average annual tax cut of about $45,000. Most middle- and lower-income households would receive a cut of less than $500 per year, if anything.

The costs of reconciliation

A number of Senate Republicans have questioned some aspects of the reconciliation package. Since they hold only a 53-47 majority, and with all Democrats expected to vote “no,” they need to use reconciliation to pass their version.

Although it differs from the House version in many ways, the Senate version still favors tax cuts for high-income households and large corporations.

Senate Republicans also employ a flawed accounting gimmick to minimize the bill’s apparent cost: It assumes the 2017 Trump tax cuts, which are set to expire, have already been extended and embeds that assumption into the budget baseline.

This makes extending the tax cuts appear costless, even though it would grow the debt substantially. The move violates normal scorekeeping conventions and misleads the public. Honest accounting would show that the Senate plan would add to the debt about $500 billion more than the House version.

Abusing the process

Lots of wrangling and changes are expected before the Senate is able to pass its version. After that, the House and Senate will need to resolve their differences in a conference committee of Republicans from each house of Congress.

Once they agree on a final version, each house votes again – and the Senate version will still need to meet the terms of reconciliation in order to pass with a majority vote. President Trump is pressuring Congress to deliver the bill to his desk before he goes on July Fourth vacation.

In my view, while reconciliation remains a powerful budgetary tool, its current use represents a fundamental inversion of its original purpose. Americans deserve an honest debate about trade-offs, rather than more debt in disguise. Some estimates of the fiscal impact of the Senate’s version of the bill are as high as $3.8 trillion over a decade. Simply waving a magic accounting wand won’t make them go away.

This article was updated to include a Senate parliamentarian ruling about several provisions of the Republican bill.

The Conversation

Linda J. Bilmes served as Deputy Assistant Secretary of the US Department of Commerce from 1997-1998 and as CFO and Assistant Secretary for Management, Budget and Administration from 1999-2001.

ref. What is reconciliation − the legislative shortcut Republicans are using to push through their ‘big, beautiful bill’? – https://theconversation.com/what-is-reconciliation-the-legislative-shortcut-republicans-are-using-to-push-through-their-big-beautiful-bill-255487

Why energy markets fluctuate during an international crisis

Source: – By Skip York, Nonresident Fellow in Energy and Global Oil, Baker Institute for Public Policy, Rice University

Stock and commodities traders found themselves dealing with various price swings as energy markets responded to Israeli and U.S. attacks on Iran. Timothy A. Clary/AFP via Getty Images

Global energy markets, such as those for oil, gas and coal, tend to be sensitive to a wide range of world events – especially when there is some sort of crisis. Having worked in the energy industry for over 30 years, I’ve seen how war, political instability, pandemics and economic sanctions can significantly disrupt energy markets and impede them from functioning efficiently.

A look at the basics

First, consider the economic fundamentals of supply and demand. The risk most people imagine in the current crisis between Israel, the U.S. and Iran is that Iran, which is itself a major oil-producing country, might suddenly expand the conflict by threatening the ability of neighboring countries to supply oil to the world.

Oil wells, refineries, pipelines and shipping lanes are the backbone of energy markets. They can be vulnerable during a crisis: Whether there is deliberate sabotage or collateral damage from military action, energy infrastructure often takes a hit.

For instance, after Saddam Hussein invaded Kuwait in August 1990, Iraqi forces placed explosive charges on Kuwaiti oil wells and began detonating them in January 1991. It took months for all the resulting fires to be put out, and millions of barrels of oil and hundreds of millions of cubic meters of natural gas were released into the environment – rather than being sold and used productively somewhere around the world.

Scenes of Kuwaiti life during and after the Gulf War of 1990 and 1991 include images of oil wells burning as a result of Iraqi sabotage.

Logistics can mess markets up too. For instance, closing critical maritime routes like the Strait of Hormuz or the Suez Canal can cause transportation delays.

Whether supply is lost from decreased production or blocked transportation routes, the effect is less oil available to the market, which not only causes prices to rise in general but also makes them more volatile – tending to change more frequently and by larger amounts.

On the flip side, demand can also shift radically. During the 1990-1991 Gulf War, demand rose: U.S. forces alone used more than 2 billion gallons of fuel, according to an Army analysis. By contrast, during the COVID-19 pandemic, industries shut down, travel came to a halt and energy demand plummeted.

When crisis looms, countries and companies often start stockpiling oil and other raw materials rather than buying only what they need right now. That creates even more imbalance, resulting in price volatility that leaves everyone, both consumers and producers, with a headache.

Regional considerations

In addition to uncertainties around market fundamentals, it’s important to note that many of the world’s energy reserves are located in regions that have not been models of stability. In the Middle East, wars, revolutions and diplomatic disputes can raise concerns about supply, demand or both.

Those worries send shock waves through the world’s energy markets. It’s like walking on a tightrope: One wrong move – or even the perception of a misstep – can make the market wobble.

Governments’ economic sanctions, such as those restricting trade with Iran, Russia or Venezuela, can distort production and investment decisions and disrupt trade flows. Sometimes markets react even before sanctions are officially in place: Just the rumor of a possible embargo can cause prices to spike as buyers scramble to secure resources.

In 2008, for example, India and Vietnam imposed rice export bans, and rumors of additional restrictions fueled panic buying and nearly doubled prices in months.

In those scrambles, investor speculation enters the picture. Energy commodities, such as oil and gas, aren’t just physical resources; they’re also traded as financial assets, like stocks and bonds. During uncertain times, traders don’t wait around for actual changes in supply and demand. They react to news and forecasts, sometimes in large groups, and can move the market on their fears or hopes alone.

The events on June 22, 2025, are a good example of how this dynamic works. The Iranian parliament passed a resolution authorizing the country’s Supreme Council to close the Strait of Hormuz. Immediately, oil prices started rising, even though the strait was still open, with oil tankers steaming through unimpeded.

The next day, Iran launched a missile strike on Qatar, but coordinated in advance with Qatari officials to minimize damage and casualties. Traders and analysts perceived the action as a de-escalatory signal and anticipated that the Supreme Council was not going to close the strait. So prices started to fall.

It was a price roller coaster, fueled by speculation rather than reality. And computer algorithms and artificial intelligence, which assist in making automated trades, only add to the chaos of price changes.

Shipping activity in the Persian Gulf and the Strait of Hormuz decreased after Israel’s attacks on Iranian nuclear facilities.

A broader look

International crises can also cause wider changes in countries’ economies – or the global economy as a whole – which in turn affect the energy market.

If a crisis sparks a recession, rising inflation or high unemployment, people and businesses tend to use less energy. When the underlying situation stabilizes, recovery efforts can mean energy consumption resumes. But it’s like a pendulum swinging back and forth, with energy markets caught in the middle.

Renewable energy is not immune to international crisis and chaos. The supply is less affected by market forces: The amount of available sunlight and wind isn’t tied to geopolitical relations. But overall economic conditions still affect demand, and a crisis can disrupt the supply chains for the equipment needed to harness renewable energy, like solar panels and wind turbines.

It’s no wonder energy markets are so jittery during international crises. A mix of imbalances between supply and demand, vulnerable infrastructure, political tensions, corporate worries and speculative trading all weave together into a complex web of volatility.

For policymakers, investors and consumers, understanding these dynamics is key to navigating the ups and downs of energy markets in a crisis-prone world. The solutions aren’t simple, but being informed is the first step toward stability.

The Conversation

Skip York is a nonresident fellow for Global Oil and Energy with the Center for Energy Studies at Rice University’s Baker Institute for Public Policy. He also is the Chief Energy Strategist at Turner Mason & Company, an energy consulting firm.

ref. Why energy markets fluctuate during an international crisis – https://theconversation.com/why-energy-markets-fluctuate-during-an-international-crisis-259839

To spur the construction of affordable, resilient homes, the future is concrete

Source: – By Pablo Moyano Fernández, Assistant Professor of Architecture, Washington University in St. Louis

A modular, precast system of concrete ‘rings’ can be connected in different ways to build a range of models of energy-efficient homes. Pablo Moyano Fernández, CC BY-SA

Wood is, by far, the most common material used in the U.S. for single-family home construction.

But wood construction isn’t engineered for long-term durability, and it often underperforms, particularly in the face of increasingly common extreme weather events.

In response to these challenges, I believe mass-produced concrete homes can offer affordable, resilient housing in the U.S. By leveraging the latest innovations of the precast concrete industry, this type of homebuilding can meet the needs of a changing world.

Wood’s rise to power

Over 90% of the new homes built in the U.S. rely on wood framing.

Wood has deep historical roots as a building material in the U.S., dating back to the earliest European settlers who constructed shelters using the abundant native timber. One of the most recognizable typologies was the log cabin, built from large tree trunks notched at the corners for structural stability.

Log cabins were popular in the U.S. during the 18th and 19th centuries.
Heritage Art/Heritage Images via Getty Images

In the 1830s, wood construction underwent a significant shift with the introduction of balloon framing. This system used standardized, sawed lumber and mass-produced nails, allowing much smaller wood components to replace the earlier heavy timber frames. It could be assembled by unskilled labor using simple tools, making it both accessible and economical.

In the early 20th century, balloon framing evolved into platform framing, which became the dominant method. By using shorter lumber lengths, platform framing allowed each floor to be built as a separate working platform, simplifying construction and improving its efficiency.

The proliferation and evolution of wood construction helped shape the architectural and cultural identity of the nation. For centuries, wood-framed houses have defined the American idea of home – so much so that, even today, when Americans imagine a house, they typically envision one built of wood.

A suburban housing development from the 1950s being built with platform framing.
H. Armstrong Roberts/ClassicStock via Getty Images

Today, light-frame wood construction dominates the U.S. residential market.

Wood is relatively affordable and readily available, offering a cost-effective solution for homebuilding. Contractors are familiar with wood construction techniques. In addition, building codes and regulations have long been tailored to wood-frame systems, further reinforcing their prevalence in the housing industry.

Despite its advantages, wood light-frame construction presents several important limitations. Wood is vulnerable to fire. And in hurricane- and tornado-prone regions, wood-framed homes can be damaged or destroyed.

Wood is also highly susceptible to water-related issues, such as swelling, warping and structural deterioration caused by leaks or flooding. Vulnerability to termites, mold, rot and mildew further compromises the longevity and safety of wood-framed structures, especially in humid or poorly ventilated environments.

The case for concrete

Meanwhile, concrete has revolutionized architecture and engineering over the past century. In my academic work, I’ve studied, written and taught about the material’s many advantages.

The material offers unmatched strength and durability, while also allowing design flexibility and versatility. It's low-cost and low-maintenance, and it has high thermal mass – the ability to absorb and store heat during the day and slowly release it during cooler nights. This can lower heating and cooling costs.

Properly designed concrete enclosures offer exceptional performance against a wide range of hazards. Concrete can withstand fire, flooding, mold, insect infestation, earthquakes, hail, hurricanes and tornadoes.

It’s commonly used for home construction in many parts of the world, such as Europe, Japan, Mexico, Brazil and Argentina, as well as India and other parts of Southeast Asia.

However, despite their multiple benefits, concrete single-family homes are rare in the U.S.

That’s because most concrete structures are built using a process called cast-in-place. In this technique, the concrete is formed and poured directly at the construction site. The method relies on built-in-place molds. After the concrete is cast and cured over several days, the formwork is removed.

This process is labor-intensive and time-consuming, and it often produces considerable waste. This is particularly an issue in the U.S., where labor is more expensive than in other parts of the world. The formwork's material and labor costs can account for 35% to 60% of the total construction cost.

Portland cement, the binding agent in concrete, requires significant energy to produce, resulting in considerable carbon dioxide emissions. However, this environmental cost is often offset by concrete’s durability and long service life.

Concrete’s design flexibility and structural integrity make it particularly effective for large-scale structures. So in the U.S., you’ll see it used for large commercial buildings, skyscrapers and most highways, bridges, dams and other critical infrastructure projects.

But when it comes to single-family homes, cast-in-place concrete poses challenges to contractors. There are the higher initial construction costs, along with a lack of subcontractor expertise. For these reasons, most builders and contractors stick with what they know: the wood frame.

A new model for home construction

Precast concrete, however, offers a promising alternative.

Unlike cast-in-place concrete, precast systems allow for off-site manufacturing under controlled conditions. This improves the quality of the structure, while also reducing waste and labor.

The CRETE House, a prototype I worked on in 2017 alongside a team at Washington University in St. Louis, showed the advantages of precast home construction.

To build the precast concrete home, we used ultra-high-performance concrete, one of the latest advances in the concrete industry. Compared with conventional concrete, it’s about six times stronger, virtually impermeable and more resistant to freeze-thaw cycles. Ultra-high-performance concrete can last several hundred years.

The strength of the CRETE House was tested by shooting a piece of wood at 120 mph (193 kph) to simulate flying debris from an F5 tornado. The projectile was unable to breach the wall, which was only 2 inches (5.1 centimeters) thick.

The wall of the CRETE House was able to withstand a piece of wood fired at 120 mph (193 kph).
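The impact test can be put in rough quantitative terms. As a back-of-the-envelope sketch, the code below computes the kinetic energy the wall absorbed, assuming a 15-pound (6.8 kilogram) 2x4 projectile – the missile weight used in standard storm-shelter impact tests – since the article gives only the speed, not the mass.

```python
# Back-of-the-envelope estimate of the debris impact test energy.
# The 6.8 kg (15 lb) projectile mass is an assumption based on standard
# storm-shelter test missiles; the article specifies only the 120 mph speed.
def kinetic_energy_joules(mass_kg: float, speed_mph: float) -> float:
    """Kinetic energy E = 1/2 * m * v^2, with speed converted to m/s."""
    speed_ms = speed_mph * 0.44704  # 1 mph = 0.44704 m/s
    return 0.5 * mass_kg * speed_ms ** 2

energy = kinetic_energy_joules(6.8, 120)
print(f"{energy:.0f} J")  # roughly 9,800 joules stopped by a 2-inch wall
```

Under these assumptions, the 2-inch ultra-high-performance concrete panel absorbed on the order of 10 kilojoules without being breached.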

Building on the success of the CRETE House, I designed the Compact House as a solution for affordable, resilient housing. The house consists of a modular, precast concrete system of “rings” that can be connected to form the entire structure – floors, walls and roofs – creating airtight, energy-efficient homes. A series of different rings can be chosen from a catalog to deliver different models that can range in size from 270 to 990 square feet (25 to 84 square meters).

The precast rings can be transported on flatbed trailers and assembled into a unit in a single day, drastically reducing on-site labor, time and cost.

Since they’re built using durable concrete forms, the house can be easily mass-produced. When precast concrete homes are mass-produced, the cost can be competitive with traditional wood-framed homes. Furthermore, the homes are designed to last far beyond 100 years – much longer than typical wood structures – while significantly lowering utility bills, maintenance expenses and insurance premiums.

The project is also envisioned as an open-source design. This means that the molds – which are expensive – are available for any precast producer to use and modify.

The Compact House is made using ultra-high-performance concrete.
Pablo Moyano Fernández, CC BY-SA

Leveraging a network that’s already in place

Two key limitations of precast concrete construction are the size and weight of the components and the distance to the project site.

Precast elements must comply with standard transportation regulations, which impose restrictions on both size and weight in order to pass under bridges and prevent road damage. As a result, components are typically limited to dimensions that can be safely and legally transported by truck. Each of the Compact House's pieces is small enough to be transported on a standard trailer.

Additionally, transportation costs become a major factor beyond a certain range. In general, the practical delivery radius from a precast plant to a construction site is 500 miles (805 kilometers). Anything beyond that becomes economically unfeasible.
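The delivery-radius constraint can be sketched as a simple cost model. In the snippet below, the per-mile trucking rate and the number of truckloads per home are hypothetical placeholder values for illustration; only the 500-mile practical radius comes from the article.

```python
# Illustrative sketch of the precast delivery-radius constraint.
# The per-mile rate and loads-per-home figures are hypothetical
# assumptions; only the ~500-mile practical radius is from the article.
def delivery_cost(distance_miles: float, loads: int = 4,
                  rate_per_mile: float = 3.0) -> float:
    """Round-trip trucking cost for hauling one home's precast rings."""
    return 2 * distance_miles * rate_per_mile * loads

def within_practical_radius(distance_miles: float,
                            max_radius_miles: float = 500) -> bool:
    """The article cites ~500 miles (805 km) as the practical limit."""
    return distance_miles <= max_radius_miles

print(delivery_cost(200))            # 4800.0
print(within_practical_radius(650))  # False
```

Because trucking cost grows linearly with distance while the value of the delivered components is fixed, there is a distance beyond which shipping swamps any savings from factory production, which is what makes the existing network of regional plants so important.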

However, the infrastructure to build precast concrete homes is already largely in place. Since precast concrete is often used for office buildings, schools, parking complexes and large apartment buildings, there's already an extensive national network of manufacturing plants capable of producing and delivering components within that 500-mile radius.

There are other approaches to building homes with concrete: Homes can use concrete masonry units, which are similar to cinder blocks. This is a common technique around the world. Insulated concrete forms involve rigid foam blocks that are stacked like Lego bricks and then filled with poured concrete, creating a structure with built-in insulation. And there's even 3D-printed concrete, a rapidly evolving technology that is still in its early stages of development.

However, none of these approaches uses precast concrete modules – the rings in my prototypes – so they all require substantially more on-site time and labor.

To me, precast concrete homes offer a compelling vision for the future of affordable housing. They signal a generational shift away from short-term construction and toward long-term value – redefining what it means to build for resilience, efficiency and equity in housing.

An image of North St. Louis, taken from Google Earth, showing how vacant land can be repurposed using precast concrete homes.
Pablo Moyano Fernández, CC BY-SA

This article is part of a series centered on envisioning ways to deal with the housing crisis.

The Conversation

Pablo Moyano Fernández does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. To spur the construction of affordable, resilient homes, the future is concrete – https://theconversation.com/to-spur-the-construction-of-affordable-resilient-homes-the-future-is-concrete-254561

3D-printed model of a 500-year-old prosthetic hand hints at life of a Renaissance amputee

Source: – By Heidi Hausse, Associate Professor of History, Auburn University

Technology is more than just mechanisms and design — it’s ultimately about people.
Adriene Simon/College of Liberal Arts, Auburn University, CC BY-SA

To think about an artificial limb is to think about a person. It’s an object of touch and motion made to be used, one that attaches to the body and interacts with its user’s world.

Historical artifacts of prosthetic limbs are far removed from this lived context. Their users are gone. They are damaged – deteriorated by time and exposure to the elements. They are motionless, kept on display or in museum storage.

Yet, such artifacts are rare direct sources into the lives of historical amputees. We focus on the tools amputees used in 16th- and 17th-century Europe. There are few records written from amputees’ perspectives at that time, and those that exist say little about what everyday life with a prosthesis was like.

Engineering offers historians new tools to examine physical evidence. This is particularly important for the study of early modern mechanical hands, a new kind of prosthetic technology that appeared at the turn of the 16th century. Most of the artifacts are of unknown provenance. Many work only partially and some not at all. Their practical functions remain a mystery.

But computer-aided design software can help scholars reconstruct the artifacts’ internal mechanisms. This, in turn, helps us understand how the objects once moved.

Even more exciting, 3D printing lets scholars create physical models. Rather than imagining how a Renaissance prosthesis worked, scholars can physically test one. It’s a form of investigation that opens new possibilities for exploring the development of prosthetic technology and user experience through the centuries. It creates a trail of breadcrumbs that can bring us closer to the everyday experiences of premodern amputees.

But what does this work, which brings together two very different fields, look like in action?

What follows is a glimpse into our experience of collaboration on a team of historians and engineers, told through the story of one week. Working together, we shared a model of a 16th-century prosthesis with the public and learned a lesson about humans and technology in the process.

A historian encounters a broken model

THE HISTORIAN: On a cloudy day in late March, I walked into the University of Alabama at Birmingham's Center for Teaching and Learning holding a weatherproof case and brimming with excitement. Nestled within the case's foam inserts was a functioning 3D-printed model of a 500-year-old prosthetic hand.

Fifteen minutes later, it broke.

This 3D-printed model of a 16th-century hand prosthesis has working mechanisms.
Heidi Hausse, CC BY-SA

For two years, my team of historians and engineers at Auburn University had worked tirelessly to turn an idea – recreating the mechanisms of a 16th-century artifact from Germany – into reality. The original iron prosthesis, the Kassel Hand, is one of approximately 35 from Renaissance Europe known today.

As an early modern historian who studies these artifacts, I work with a mechanical engineer, Chad Rose, to find new ways to explore them. The Kassel Hand is our case study. Our goal is to learn more about the life of the unknown person who used this artifact 500 years ago.

Using 3D-printed models, we’ve run experiments to test what kinds of activities its user could have performed with it. We modeled in inexpensive polylactic acid – plastic – to make this fragile artifact accessible to anyone with a consumer-grade 3D printer. But before sharing our files with the public, we needed to see how the model fared when others handled it.

An invitation to guest lecture on our experiments in Birmingham was our opportunity to do just that.

We brought two models. The main release lever broke first in one and then the other. This lever has an interior triangular plate connected to a thin rod that juts out of the wrist like a trigger. After pressing the fingers into a locked position, pulling the trigger is the only way to free them. If it breaks, the fingers become stuck.

The thin rod of the main release lever snapped in this model.
Heidi Hausse, CC BY-SA

I was baffled. During testing, the model had lifted a 20-pound simulation of a chest lid by its fingertips. Yet, the first time we shared it with a general audience, a mechanism that had never broken in testing simply snapped.

Was it a printing error? Material defect? Design flaw?

We consulted our Hand Whisperer: our lead student engineer, whose feel for how the model works appears at times preternatural.

An engineer becomes a hand whisperer

THE ENGINEER: I was sitting at my desk in Auburn’s mechanical engineering 3D print lab when I heard the news.

As a mechanical engineering graduate student concentrating on additive manufacturing, commonly known as 3D printing, I explore how to use this technology to reconstruct historical mechanisms. Over the two years I’ve worked on this project, I’ve come to know the Kassel Hand model well. As we fine-tuned designs, I’ve created and edited its computer-aided design files – the digital 3D constructions of the model – and printed and assembled its parts countless times.

This view of the computer-aided design file of a strengthened version of the model, which includes ribs and fillets to reinforce the plastic material, highlights the main release lever in orange.
Peden Jones, CC BY-SA

Examining parts midassembly is a crucial checkpoint for our prototypes. This quality control catches, corrects and prevents defects such as misprinted or damaged parts, and it's essential for creating consistent and repeatable experiments. A new model version or component change never leaves the lab without passing rigorous inspection. This process means there are ways this model has behaved over time that the rest of the team has never seen. But I have.

So when I heard the release lever had broken in Birmingham, it was just another Thursday. While it had never snapped when we tested the model on people, I’d seen it break plenty of times while performing checks on components.

Our model reconstructs the Kassel Hand’s original metal mechanisms in plastic.
Heidi Hausse, CC BY-SA

After all, the model is made from relatively weak polylactic acid. Perhaps the most difficult part of our work is making a plastic model as durable as possible while keeping it visually consistent with the 500-year-old original. The iron rod of the artifact's lever can handle far more force than our plastic version – it has at least five times the yield strength.
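The force gap between the two materials follows directly from the yield strengths. As a rough sketch, the axial force a rod can take before yielding is F = sigma_y * A; the yield strength values below are illustrative textbook-range assumptions (about 50 MPa for printed PLA and 250 MPa for wrought iron), not measurements of the artifact, and the rod diameter is likewise a placeholder.

```python
import math

# Rough comparison of the force a lever rod can bear before yielding,
# F = yield strength * cross-sectional area. The ~50 MPa (PLA) and
# ~250 MPa (wrought iron) values and the 3 mm diameter are illustrative
# assumptions, not measurements of the Kassel Hand or the model.
def yield_force_newtons(yield_strength_mpa: float,
                        rod_diameter_mm: float) -> float:
    area_mm2 = math.pi * (rod_diameter_mm / 2) ** 2
    return yield_strength_mpa * area_mm2  # MPa * mm^2 = N

pla_force = yield_force_newtons(50, 3.0)    # plastic model's rod
iron_force = yield_force_newtons(250, 3.0)  # original artifact's rod
print(f"iron/PLA force ratio: {iron_force / pla_force:.0f}x")  # 5x
```

Since the cross-sectional area is the same for both rods, the ratio of bearable forces reduces to the ratio of yield strengths, which is why matching the original's slender geometry in plastic leaves so little margin against a hard pull on the trigger.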

I suspected the lever had snapped because people pulled the trigger too far back and too quickly. The challenge, then, was to prevent this. But redesigning the lever to be thicker or a different shape would make it less like the historical artifact.

This raised the question: Why could I use the model without breaking the lever, but no one else could?

The team makes a plan

THE TEAM: A flurry of discussion led to growing consensus – the crux of the issue was not the model, it was the user.

The original Kassel Hand’s wearer would have learned to use their prosthesis through practice. Likewise, our team had learned to use the model over time. Through the process of design and development, prototyping and printing, we were inadvertently practicing how to operate it.

We needed to teach others to do the same. And this called for a two-pronged approach.

A modern prosthetist's perspective on using the Kassel Hand.

The engineers reexamined the opening through which the release trigger poked out of the model. They proposed shortening it to limit how far back users could pull it. When we checked how this change would affect the model’s accuracy, we found that a smaller opening was actually closer to the artifact’s dimensions. While the larger opening had been necessary for an earlier version of the release lever that needed to travel farther, now it only caused problems. The engineers got to work.

The historians, meanwhile, created plans to document and share the various techniques for operating the model that the team hadn't realized it had honed. To teach someone at home how to operate their own copy, we filmed a short video explaining how to lock and release the fingers and troubleshoot when a finger sticks.

Testing the plan

Exactly one week after what we called “the Birmingham Break,” we shared the model with a general audience again. This time we visited a colleague’s history class at Auburn.

We brought four copies. Each had an insert to shorten the opening around the trigger. First, we played our new instructional video on a projector. Then we turned the models over to the students to try.

The team brought these four models with inserts to shorten the opening below the release trigger to test with a general audience of undergraduate and graduate students.
Heidi Hausse, CC BY-SA

The result? Not a single broken lever. We publicly launched the project on schedule.

The process of introducing the Kassel Hand model to the public highlights that just as the 16th-century amputee who wore the artifact had to learn to use it, one must learn to use the 3D-printed model, too.

It is a potent reminder that technology is not just a matter of mechanisms and design. It is fundamentally about people – and how people use it.

The Conversation

Heidi Hausse received funding from the Herzog August Bibliothek; the Consortium for History of Science, Technology and Medicine; the American Council of Learned Societies; the Huntington Library; the Society of Fellows in the Humanities at Columbia University; and the Renaissance Society of America.

Peden Jones received funding from the Renaissance Society of America.

ref. 3D-printed model of a 500-year-old prosthetic hand hints at life of a Renaissance amputee – https://theconversation.com/3d-printed-model-of-a-500-year-old-prosthetic-hand-hints-at-life-of-a-renaissance-amputee-256670