At dawn on October 28, residents of Rio de Janeiro woke to the sound of gunfire. Battles continued throughout the day in the favelas of Alemão and Penha, as police mounted a huge operation targeting the Comando Vermelho, or the Red Command, one of Brazil’s largest organised criminal gangs.
In the days that followed, as graphic images showed lines of bodies on the streets, it emerged that at least 115 civilians and four police officers had been killed, making it the most violent police operation in Brazilian history.
A poll carried out two days after the raid indicated that 62% of Rio residents supported the operation – rising to 88% in the favelas. But there were also protests against alleged extrajudicial killings, and condemnation from the UN and other human rights organisations.
The violent operation overshadowed the start of the Cop30 climate summit in Belém, on the edge of the Amazon. At a press conference upon his arrival in Belém, Brazil’s President Luiz Inácio Lula da Silva, who had not been aware of the operation beforehand, condemned the raid as “disastrous” and a “mass killing”.
In this episode of The Conversation Weekly podcast, we speak to Robert Muggah, founder of the Igarapé Institute and a research collaborator at the Brazil LAB at Princeton University, about how organised crime became so deeply embedded in Brazil – and whether there’s a better way to confront it.
The origins of the Red Command lie in Brazilian prisons during the years of Brazil’s military dictatorship in the 1970s. “The authorities at the time often would crowd common criminals together with left-wing political prisoners in the same jails,” explains Muggah. “You had this almost metastasis happening between these different inmates and … an alliance emerged from these two groups called the falange vermelha, which means the red phalanx.”
Incubated in the prison system, the gang moved out onto the streets, shedding its left-wing ties as the dictatorship ended. “By the 1980s, you have a fairly well-organised group which is diversifying its income streams from what was typically bank robberies or targeted raids, to the cocaine economy,” Muggah says.
Today, the Red Command has expanded out of Rio and is present across Brazil and in neighbouring countries. “What you’ve seen over the past decade in particular is the penetration of organized crime, not into just new geographic areas, but entirely new sectors of the economy,” says Muggah.
Listen to the interview with Robert Muggah on The Conversation Weekly podcast, and read an article he wrote in Portuguese on the October 28 operation against the Red Command.
This episode of The Conversation Weekly was written and produced by Katie Flood, Mend Mariwany and Gemma Ware. Mixing by Eleanor Brezzi and theme music by Neeta Sarl.
Listen to The Conversation Weekly via any of the apps listed above, download it directly via our RSS feed or find out how else to listen here. A transcript of this episode is available on Apple Podcasts or Spotify.
Robert Muggah is the co-founder of the Igarapé Institute, a think and do tank in Brazil, and a principal and co-founder of SecDev, a geopolitical and digital advisory group.
Source: The Conversation – Africa – By Nnamdi O. Madichie, Professor of Marketing & Entrepreneurship, Unizik Business School, Nnamdi Azikiwe University
Short comedy videos circulating on social media have created a booming industry in Nigeria in the past few years. The country’s comedy creators put their skits out on platforms like YouTube, TikTok and Instagram to reach a massive audience.
As these online comedians gain followers they make their money from advertising, by endorsing brands as influencers, and through collaborations. In Nigeria the industry is popularly called the skit economy.
This phenomenon represents more than a major new entertainment trend. It highlights the ingenuity of young Nigerians in using technology to create livelihoods and influence culture. In the process, they contribute to national economic growth.
The skit industry has joined the likes of Nollywood film, Afrobeats music and local fashion to put the country on the entertainment map globally.
The rise of the industry is chronicled in the 2024 book Skit Economy: How Nigeria’s Comedy Skit-Makers Are Redefining Africa’s Digital Content Landscape, by entrepreneurship scholar and polling guru Bell Ihua. His work is supported by findings from the Africa Polling Institute.
As he explains:
The Nigerian entertainment industry is undoubtedly creating job opportunities and contributing to the country’s diversification from oil … The industry is rated as the second most significant employer of youths in Nigeria after agriculture, employing over one million people.
According to his book, skit-making is estimated to be Nigeria’s third largest entertainment industry sector, with a net worth of over US$31 million.
What becomes clear as you read it is that social media platforms have not only amplified the reach and impact of skits. Online platforms have allowed creators to reach global audiences while preserving the culture, language and stories unique to their communities. Skit creators prove the potential of comedy as a medium for both entertainment and cultural diplomacy.
However, as the industry grows, argues Ihua, the skit economy must navigate new challenges related to representation and ethics.
What’s in the book
The book’s eight chapters cover Africa’s digital content landscape, taking into account the continent’s youth bulge and the evolution of social media and content creation.
Ihua then explores Nigeria’s booming cultural and creative industries before homing in on comedy skit-making in chapter 4. He classifies the various types of digital content creation in Nigeria and outlines the trends in online videos before presenting an in-depth national study of comedy skit-making in chapter 7. He then considers implications for public policy and future research in the field.
What makes the book so compelling is that it recognises skit-making as an ecosystem on its own terms. It then defines what that ecosystem looks like in Nigeria. In the process Ihua makes it clear why books like this matter.
They are a call for taking entertainment seriously and investing future research in it. Social media and digital technology have reconfigured an unsung economic sector that’s capable of including the bulging youth population in the national conversation. This is despite limited institutional support.
What’s driving the boom
Ihua traces its boom to COVID-19 lockdowns that began in Nigeria in 2020:
They provided a source of laughter and relief to many Nigerians, as most people found it safer to stay at home and get entertained with skits.
Today, writes Ihua, two-thirds of Nigerians watch comedy skits frequently. According to his study they serve as stress relief and social commentary.
With 63% of Nigerians under 25 and high social media uptake, skit-making taps into abundant creative energy and mobile-first audiences.
Value
Skit Economy highlights how skit comedians create direct and indirect jobs (editors, social media managers, brand consultants). They generate income through endorsements, platform monetisation (the revenue they get from advertising on a platform like YouTube), and various partnerships and collaborations.
Their cultural value is not just measured in their global influence. Skits reflect everyday Nigerian realities with humour and satire, influencing local public opinion and reinforcing national identity.
As prominent Nigerian entrepreneur and cultural worker Obi Asika notes in the book’s foreword:
Their success … stems from a combination of talent, creativity, innovation, an entrepreneurial spirit, and a deep understanding of their audience’s preferences and cultural nuances.
Challenges
However, Ihua identifies a number of challenges facing the industry.
Financial rewards are unequal. Only top creators earn sustainably. For many skit-makers revenue is unstable.
Working from Nigeria means dealing with infrastructure deficits. Electricity supply is unreliable, the internet is expensive and there is limited access to digital production tools.
Nigerian skit-makers also operate in a climate where there are weak intellectual property protections. Piracy and unauthorised reuse undermine earnings.
The job can be an ethical minefield. Pranks can be harmful. They can perpetuate stereotypes and be insensitive to minorities.
These challenges are compounded by a policy vacuum. There is little government recognition or support for digital creatives in Nigeria.
An African future?
For Ihua, skit-making is a good example of how new digital industries can aid in absorbing Africa’s growing youth workforce. With adequate support, skit-making can help provide dignified livelihoods.
So, for Ihua these creators are not merely entertainers. They’re also job creators, cultural ambassadors, and catalysts of digital transformation.
For Africa broadly, the rise of skit-making underscores the continent’s potential to innovate in ways that are uniquely aligned with its youthful demographics and digital future.
Nigeria’s skit economy offers a blueprint for the continent. Already, skit-making is spreading to other countries, like Ghana, Kenya and South Africa. The lines are blurring between stand-up or TV comedians and skit makers.
If nurtured with the right infrastructure, policy, and industry support, the skit economy could evolve from an informal hustle into a structured pillar of Africa’s creative economy. This could further solidify the continent’s role in the global cultural imagination.
Nnamdi O. Madichie does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
A chapbook – a small publication usually under 40 pages – is an accessible and honoured format for poets to publish focused selections of their work. In this series, each chapbook features an emerging African poet, and is presented as part of a beautifully designed box set of 10 or more chapbooks. Besides the poetry itself, each box set also showcases the work of a commissioned African visual artist. The artists include Sokari Douglas Camp, Victor Ehikhamenor, Ficre Ghebreyesus and Aida Muluneh, among others.
This ever-growing archive has now published over 100 poets, and offers a window into the diversity of African poetic expression today.
Marking the project’s 10th anniversary is a new anthology called Toward a Living Archive of African Poetry, edited by Jordanian writer Siwar Masannat. It collects Dawes and Abani’s rich introductions to each box set and has a foreword by Masannat. Through it, readers get a layered and necessary account of the impact of the series, and of how these chapbooks have transformed the visibility of African poets over the past decade.
My work as a scholar of African literature focuses on recovering overlooked histories and interrogating the spaces in which literature is made and circulated.
This new anthology matters because it documents not just poems, but a cultural movement that redefines what an African literary archive can be, and why poetry remains central to that conversation.
Decidedly diasporic
While the series places Africa at the centre of its imagination, its focus is largely diasporic, shaped by Africans living outside the continent. The majority of the poets live in the US or the UK. Poets based on the continent form a minority and are scattered geographically.
The editors acknowledge this imbalance, attributing it to “better access to workshops and craft education” available to diaspora poets. The result is an archive arguably shaped less by the immediacies of the continent and more by the diaspora’s sensibilities and infrastructures.
Nigeria, more than any of the 25-odd countries included in the chapbooks, shapes the aesthetics of the series. This reflects both the density of the country’s literary networks and the curatorial choices of the editors. They rely heavily on personal connections and prize pools to spot new and emerging talents.
A recurring feature of the poets in the series is the “hyphenated African”: Somali-American, Ghanaian-British, Ethiopian-German, Sierra Leonean-American. Some were born in countries outside Africa or migrated as toddlers. Their Africanness is claimed through memory, nostalgia, heritage, or family history, rather than geography.
The editors assert that all the poets “self-identify as Africans in the full and complicated way that Africanness is best defined”. This also underscores how the project expands the category of African poetry.
In fact, the transcontinental profile of these writers shows how African poetry today cannot be read solely through a nationalist lens. The hybridity of identity and place becomes central. Many poets occupy in-between spaces – culturally, geographically, linguistically and emotionally.
Still, the series impresses on many other levels, particularly in its commitment to highlighting the continent’s plural and localised poetics, and in its rare, long-term investment in the future of African poetry.
The series also signals an important feminist turn in African poetics. The chapbook form becomes a space where African women’s voices are nurtured and given international circulation, countering historical silences. The poets here highlight a generational continuity of feminist expression.
Intergenerational
The birth years of poets in the series range from 1963 to 2007, showcasing a vibrant intergenerational dialogue. The older poets often engage in socio-political critique informed by post-independence transitions. Millennial and Gen Z poets frequently explore themes of identity, queerness, internet culture, displacement and decoloniality with linguistic experimentation and digital fluency.
Ghanaian poet Tryphena Yeboah, in her chapbook, A Mouthful of Home, exemplifies this:
I TELL MY MOTHER I WANT A BODY THAT
EXPANDS
Into a map. She wants to know where I’ll travel to. I say
“myself”.
The act of travel becomes a metaphor for self-mapping that captures how younger African poets reimagine movement, belonging and home as internal, affective geographies.
In contrast, South African poet Ashley Makue, in her chapbook, i know how to fix myself, offers a more visceral expression of embodied trauma and inherited violence:
my mother is a war zone
they don’t tell her that
these men that pee in her
and leave with gunpowder in their chests
Living archive
The New Generation African Poets Chapbook Series has been an extraordinary intervention in the history of African poetry. It has foregrounded a generation, opened an aesthetic safe space, and created a beautiful, living archive.
Dawes and Abani introduce each of the box sets with two introductions – what they call “simultaneous conversations” – and they often debate identity, the style of the poetry, circulation, and other issues.
This is more than an impressive catalogue; it is a breathing archive of African poetic consciousness, one that resists static definitions. It captures the fluidity of identity, the urgency of voice, and the diverse shaping of African poetry today.
What it tells us: that African poetry is thriving, diverse and globally mobile. What it does not tell us: how poets working entirely from the continent might imagine and enact African poetics differently.
But by foregrounding new and emerging voices, the Africa Poetry Book Fund affirms that poets remain vital chroniclers of the African experience, articulating emotion, history and imagination in ways that other forms of writing often cannot.
Tinashe Mushakavanhu does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
The water infrastructure politics of eThekwini, the municipality that includes the city of Durban, have been splashed across the digital pages of South Africa’s news outlets in recent years.
The city’s water politics has a long history. Some of the infrastructure issues can be traced back to the mid-1800s, when it was a British imperial port.
I’m a historian with an interest in coastal communities and urban life. As part of my work on water as a public health concern in colonial cities, I spent months in the Durban Archives Repository, going through correspondence, reports, business contracts, newspaper clippings and town council minutes.
The records revealed how the system of colonial-era water infrastructure worked – and for whom.
The first water technologies in Durban were British-styled wells. Anyone could use them, for free. They brought people of different origins and class together for practical purposes but also created anxiety about social difference. For colonial officials, the public had to follow British standards or lose access to the infrastructure altogether. They created Durban’s first water-policing system, purportedly for better public health and conservation. While wealthier and white people eventually came to rely on piped water, poorer and black (Zulu and Indian) people were excluded.
This system formed the basis for the uneven access to water that today’s residents experience. People still depend on private water infrastructure as the municipal system struggles.
Nineteenth-century infrastructure
Founded by British traders as Port Natal in 1824, the colonial borough of Durban depended on stand-alone water infrastructures from the beginning. Brick and cement wells were the first technologies from which residents drew water, since they were easy to build and maintain. Most wells had either a bucket or a pump attached to them. Pumps attached to wells became common after the borough made most wells publicly available in the mid-1850s.
Water tanks, on the other hand, were private technologies which mainly lay underground. Only wealthier households and businesses could afford to build them. They became prominent in the 1870s.
It’s hard to know exactly how many of these infrastructures existed in total. By the 1870s, though, official reports indicate that about 18 public wells and pumps across the town served the bulk of the town’s approximately 20,000 inhabitants.
Piped water came to Durban in the 1880s, supplied initially by the spring at Curries Fountain. In 1889, the city’s laws were extended to cover private tanks that were filled from the municipal pipes. Even so, much of the population still relied on standalone infrastructures for water supplies.
As time went by, conflicts began to brew. The rising population placed a strain on these stand-alone infrastructures, which offered varying amounts of water depending on rainfall patterns. Arguments sparked when a community drew too much water or polluted a well, creating a local water scarcity.
Clashes and restrictions
White colonists blamed much of the water scarcity and contamination on African labourers who worked as household or business servants, sanitary workers and launderers. These positions demanded a close relationship with fresh water collection and use, which meant African labourers became the main users of wells, pumps and tanks.
Labourers did not always use water technologies according to colonial expectations, however. Local people were accustomed to using open water sources like rivers and streams, not restrictive iron and brick infrastructures. So, they modified their traditional work at open sources, like washing objects and produce, to the new technologies they had to use.
That sometimes created problems, according to the archive records. They accidentally broke handles and chains when pumping too quickly. They drew water from tanks without using a filter, which was officially perceived as a disease risk. They publicly washed clothing, bodies and food at wells, where the dirty wash water flowed back into the enclosed water supply.
Colonists exploited this situation to place restrictions on how labourers could use stand-alone water infrastructures. Borough officials crafted new laws that forced colonised residents to conform with British standards. They punished those who did not comply with fines, verbal lashings and even jail time.
Durban was part of a colonial system predicated on white supremacy. The government sought to maintain segregation between white colonists and African and South Asian residents. So, it imbued its water technology regulations with the notion that some water management actions – British – were “healthier” than others, namely African and South Asian. If someone used a technology contrary to British standards, then they faced restricted access to public technologies and the water they provided.
Water system legacy
Stand-alone water infrastructures still exist across eThekwini. Many residents of informal settlements and formerly racially segregated areas remain officially unconnected with municipal pipes. They instead depend on local wells, pumps and illegal individualised connections. An increasing number of households are investing in water tanks as the municipal water system becomes more unreliable.
Things have, of course, changed since the 19th century. However, the municipality continues to require residents to use these technologies within regulatory boundaries if they want to maintain access to them. Cutting off the municipal water supply to private storage tanks is an example.
Infrastructural stopgaps further expose a water system that was never meant to supply every resident equitably and without restriction. These actions tell us that today’s officials have inherited and inadvertently continue a water system that was meant to exclude more than include, to punish more than teach, to restrict more than provide.
Kristin Brig receives funding from the US Fulbright Program, the US National Science Foundation (NSF), and Johns Hopkins University.
With more than 85 million people naming it their top choice, Canada has become one of the most desired migration destinations in the world over the past decade.
Yet even in 2024, its highest year on record, Canada only admitted about 480,000 new permanent residents, a small fraction of global demand.
The challenge, however, is not how few people get in; it is how unpredictable the system has become.
Admissions of permanent residents by year (1980-2027) (Immigration, Refugees and Citizenship Canada)
A shifting framework
In June 2022, the federal government amended the Immigration and Refugee Protection Act to give itself more flexibility.
It rolled out a new immigration stream to prioritize in-demand occupations in health care, engineering and agriculture, as well as French-speaking applicants.
In the earlier system, fixed points for education and high-skilled work experience provided applicants with a clear way to assess their eligibility. In contrast, the new category-based approach relies on occupational needs that shift rapidly.
The goal was to respond quickly to labour shortages and economic goals by consulting with provinces, industries, labour groups and the public. However, this category-based selection has been rolled out with little consistency or transparency. Announcements come with no clear timelines, fixed numbers or indication of when a stream might close.
In this new framework, broad categories such as health care or STEM (science, technology, engineering and mathematics) encompass hundreds of distinct occupations. Yet the government may single out only a handful of these occupations for invitations while excluding the rest, which makes outcomes unpredictable even within the announced priority categories.
Migration is a long-term project
What this changing immigration policy fails to consider is that immigration is not an instant decision, but a long-term project.
My research shows that people may spend more than a decade preparing for migration by carefully choosing a field of study, seeking related work experience, saving aggressively and even reshaping their personal lives. Some even avoid intimate relationships or postpone having children in hopes of migrating. However, those plans fall apart when the qualifying requirements change quickly.
The uncertainty created by shifting immigration policies is not felt only abroad. Within Canada, roughly three million people are on temporary permits, and many of them are hoping for a chance at permanent residency. They spend years establishing roots in their communities, with the belief that it will ultimately lead to a more secure future. But when policy priorities change unexpectedly, their lives are thrown into limbo.
Yet as the rules change, international students may find themselves with no option to stay in Canada once their studies end. Similarly, temporary foreign workers may fill urgent labour shortages, only to see pathways to permanence narrow or close before they can apply.
A problem for everyone
Quick and unpredictable changes in rules make immigration seem like a lottery rather than a structured system. Success now often depends not on careful planning or merit, but on being in the right place at the right time.
The lottery effect erodes confidence in Canada’s immigration policy. It conveys the idea that long-term planning and investment might not be essential and that today’s standards might change tomorrow.
Uncertainty also fuels a darker consequence: fraud.
Consequently, the political climate has shifted toward risk-averse immigration policies that focus on immediate results instead of developing sustainable approaches.
A more sustainable system
Immigration is essential to Canada’s future because it sustains the workforce as the population ages, with nearly all of Canada’s labour force growth now coming from newcomers.
Meanwhile, many temporary residents who have studied, worked in highly skilled jobs and paid taxes for years are ineligible to apply for permanent status because their occupations are not on the list. They end up leaving despite their contributions.
The immigration system should include defined criteria, realistic deadlines and transparent information that lets people inside and outside Canada plan with confidence. Consistency is crucial.
A more sustainable approach would connect permanent residency more closely to proven success in the Canadian labour market. At the end of the day, immigration should be based on preparation, abilities and dedication — certainly not on luck.
Omid Asayesh does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Solid components of brewing waste, such as used grains and yeast, end up in landfills, where harmful compounds can leach into the soil. Brewing wastewater that makes it into aquatic ecosystems can contaminate streams and lakes, decrease oxygen levels in those environments and threaten organisms.
To keep this waste from going into the environment, scientists like me are exploring how to manufacture beer brewing waste into useful products. I’m a chemist, and my research team and I are interested in figuring out how to recycle and repurpose brewery waste into tiny particles that can be used to make new types of prescription drugs.
The brewing process
The brewing process takes raw cereal grain – usually from barley – and converts its starch and proteins into simpler chemicals by malting. Brewers initiate this process by adding water, which wakes the seed from dormancy, and then keeping the seeds at a controlled temperature to sprout the grain.
During this time, important enzymes are released that can convert the starch and proteins in the grains to fermentable sugars and amino acids. They then heat up the resulting product, called the malt, to dry it out and stop further sprouting. After this malting process, they add hot water and mash the malt to release the compounds that give the beer its iconic flavor.
The brewing process produces waste at four main stages. Alcina Johnson Sudagar, CC BY-SA
Brewers then separate out the sweet malt extract, called wort; the leftover solid is removed as waste, known as brewer’s spent grains. About 30% of the weight of the raw grain ends up as spent grain waste, which is either used as animal feed or discarded. About 30 million tons of spent grain are generated annually.
Brewers add a cone-shaped flower of the Humulus lupulus plant, called hops, to the wort, then boil and clarify it. The hops flower is the key ingredient that gives beer its bitterness and aroma. The undissolved hops and proteins get collected during clarification to form hot trub, the second major waste from breweries. Roughly 85% of the hops are removed as waste material.
The clear wort is then cooled and fermented by adding yeast. The yeast filtered out after fermentation, called brewer’s spent yeast, forms the third type of waste that breweries generate. The spent yeast is one of the major byproducts of the brewing industry. This waste has a large quantity of water and solid material: 100 liters of beer generate 2 to 4 kilograms (4.4 to 8.8 lbs.) of spent yeast.
Finally, the fermented beer is filtered before entering the production line, where it is bottled for consumption. The wastewater generated at this last stage forms the filtration waste. A medium-size brewery generates about 8 tons of dense sludge and five to seven times that amount – 40 to 56 tons – of wastewater as filtration waste monthly. Several tons of waste from breweries remain largely underused due to their low economic value.
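As a rough sketch, the quantities quoted above can be combined into a small back-of-envelope calculator. The input figures below (100 tons of grain, 10,000 liters of beer, 8 tons of sludge) are illustrative assumptions, not data from any real brewery:

```python
# Back-of-envelope estimates of brewery waste streams,
# using the rules of thumb quoted in the article.

def spent_grain_tons(raw_grain_tons):
    """About 30% of the raw grain's weight ends up as brewer's spent grain."""
    return raw_grain_tons * 30 / 100

def spent_yeast_kg(beer_liters):
    """100 liters of beer yield roughly 2 to 4 kg of spent yeast; return the range."""
    return beer_liters / 100 * 2, beer_liters / 100 * 4

def filtration_wastewater_tons(sludge_tons):
    """Filtration wastewater is five to seven times the sludge mass."""
    return 5 * sludge_tons, 7 * sludge_tons

# A hypothetical medium-size brewery:
print(spent_grain_tons(100))          # 30.0 tons of spent grain from 100 tons of grain
print(spent_yeast_kg(10_000))         # (200.0, 400.0) kg of spent yeast from 10,000 L of beer
print(filtration_wastewater_tons(8))  # (40, 56) tons of wastewater from 8 tons of sludge
```

The last line reproduces the article’s own figures: 8 tons of sludge implies 40 to 56 tons of wastewater per month.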
The brewery waste problem
These wastes have several compounds, such as carbohydrates, proteins, amino acids, minerals and vitamins that can potentially be repurposed. Scientists have tried to reuse the wastes in creative ways by creating biofuels and vegan leather using either some compounds extracted from the waste or the entire waste.
Breweries can send their solid wastes to farms that repurpose them as soil fertilizer, compost or animal feed, but a major fraction industrywide is sent to landfills. The wastewater is discharged into sewage lines, which can strain sewage treatment systems, since it contains pollutant concentrations more than 30 times higher than those of typical residential sewage.
Although breweries are becoming more aware of their waste and moving toward sustainable approaches, demand for beer has continued to rise, and a large amount of waste remains to be dealt with.
Repurposing waste in nanoparticles
In my research, I’m interested in determining whether compounds from brewery waste can help create nanoparticles that are compatible with human cells but fight against bacteria. Nanoparticles are extremely tiny particles that have sizes in the range of one-billionth of a meter.
Nanoparticles are smaller than bacteria – they can be the size of viruses or even human DNA. Alcina Johnson Sudagar, CC BY-SA
My team and I developed nanoparticles coated with some of the compounds found in brewery waste – an invention which we have since patented but are not actively commercializing. We created the particles by adding waste from any stage of brewing to a metal source.
When we added a chemical containing silver – for example, silver nitrate – to the waste, a combination of processes converted the silver compound into nanoparticles. One process is called reduction: here, compounds found in the brewery waste undergo a chemical reaction that converts the silver ions from the silver nitrate into metallic silver nanoparticles.
The other process, called precipitation, is similar to how chalky soap scum forms in your sink when soap reacts with minerals such as calcium in hard water. Oxide and phosphate from the brewery waste combine with a silver ion from the silver nitrate, causing the silver to form a solid compound that makes up the nanoparticle’s core.
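Schematically, and as a simplification of the chemistry described above, the two routes can be written as follows (the electrons come from reducing organic compounds in the waste; the exact donor species are an assumption):

```latex
\begin{align*}
\text{Reduction:} \quad & \mathrm{Ag^{+} + e^{-} \longrightarrow Ag^{0}} \\
\text{Phosphate precipitation:} \quad & \mathrm{3\,Ag^{+} + PO_{4}^{3-} \longrightarrow Ag_{3}PO_{4}\downarrow} \\
\text{Oxide formation:} \quad & \mathrm{2\,Ag^{+} + 2\,OH^{-} \longrightarrow Ag_{2}O\downarrow + H_{2}O}
\end{align*}
```

The three solid products of these reactions match the three components of the nanoparticles described below: silver metal, silver phosphate and silver oxide.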
The organic compounds from the brewing waste, such as proteins, carbohydrates, polyphenols and sugars, form a coating on the nanoparticles. This coating prevents further reactions from happening on the surface of the particles, which is crucial for keeping the nanoparticles stable in their applications. The nanoparticles prepared from brewery waste were made of three components: silver metal, silver oxide and silver phosphate.
Nanoparticle preparation using a one-pot method. Alcina Johnson Sudagar, CC BY-SA
Environmentally friendly processes that reduce the use of hazardous chemicals and minimize harmful side products are known as green chemistry. Because our procedure was so simple and did not use any other chemicals, it falls into this green chemistry category.
Nanoparticle safety
My colleague Neha Rangam found that the coating formed by the brewery waste compounds makes these nanoparticles nontoxic to human cells in the lab. However, the silver from these nanoparticles killed Escherichia coli, a common bacterium responsible for intestinal illness around the world.
We found that a special type of nanoparticle containing high amounts of silver phosphate worked against E. coli. It appeared that these silver phosphate nanoparticles had a thinner coating of organic compounds from the brewery waste than the silver metal and silver oxide particles, which allowed better contact with the bacteria. That meant enough silver could reach the bacteria to disrupt their cellular structure. Silver has long been known to have an antimicrobial effect. By creating nanoparticles from silver, we get lots of surface area available for eliminating bacteria.
Several nanoparticles have been in clinical trials and some have been FDA approved for use in drugs for pain management, dental treatment and diseases such as cancer and COVID-19. Most research into nanoparticles in biotechnology has dealt with carbon-based nanoparticles. Scientists still need to see how these metal nanoparticles would interact with the human body and whether they could potentially cause other health problems.
Because they’re so tiny, these particles are difficult to remove from the body unless they are attached to drug carriers designed to transport the nanoparticles safely. Before doctors can use these nanoparticles as antibacterial drugs, scientists will need to study the fate of these materials once they enter the body.
Some engineered nanoparticles can be toxic to living organisms, so research will need to address whether these brewery waste-derived nanoparticles are safe for the human body before they’re used as a new antibacterial drug component.
Alcina Johnson Sudagar received funding from the European Union’s Marie Curie Horizon 2020 program for this work. Part of the work has been patented (Polish patent no. P.435084, valid since August 2020).
Trump’s demands that Obama release his birth certificate had, in part, made Trump a front-runner among Republican hopefuls for their party’s nomination in the following year’s presidential election.
Obama referred to Trump’s presidential ambitions by joking that, if elected, Trump would bring some changes to the White House.
Obama then called attention to a satirical photo the guests could see of a remodeled White House with the words “Trump” and “The White House” in large purple letters followed by the words “hotel,” “casino” and “golf course.”
Obama’s ridicule of Trump that evening has been credited with inspiring Trump to run for president in 2016.
My book, “The Art of the Political Putdown,” includes Obama’s chiding of Trump at the correspondents’ dinner to demonstrate how politicians use humor to establish superiority over a rival.
Obama’s ridicule humiliated Trump, who temporarily dropped the birther conspiracy before reviving it. But Trump may have gotten the last laugh, using the humiliation of that night, as some believe, as motivation for his 2016 presidential run.
There is a further twist to Obama joking about Trump’s renovations to the White House if Trump became president. Trump has fulfilled Obama’s prediction, kind of.
The Trump administration has razed the East Wing, which sat adjacent to the White House, and will replace it with a 90,000-square-foot, gold-encrusted ballroom that appears to reflect the ostentatious tastes of the president.
It’s expected to be big enough to accommodate nearly a thousand people. Design renderings suggest that the ballroom will resemble the ballroom at Mar-a-Lago, the president’s private estate in Palm Beach, Florida.
“I don’t have any plan to call it after myself,” Trump said recently. “That was fake news. Probably going to call it the presidential ballroom or something like that. We haven’t really thought about a name yet.”
But senior administration officials told ABC News that they were already referring to the structure as “The President Donald J. Trump Ballroom.”
The renovation will include neither a hotel, a casino nor a golf course – the features Obama mentioned in his light-hearted speech at the 2011 correspondents’ dinner.
A video is shown as President Barack Obama speaks about Donald Trump at the White House Correspondents’ Association dinner in Washington on April 30, 2011. AP Photo/Manuel Balce Ceneta
Obama pokes fun at Trump
In the months before the 2011 correspondents’ dinner, Trump had repeatedly claimed that Obama had not been born in Hawaii but had instead been born outside the United States, perhaps in his father’s home country of Kenya.
The baseless conspiracy theory became such a distraction that Obama released his long-form birth certificate in April 2011.
Three days later, Obama delivered his speech at the correspondents’ dinner with Trump in the audience. He joked that Trump, having put the birther conspiracy behind him, could move on to other conspiracy theories, such as whether the moon landing was staged, whether aliens landed in Roswell, New Mexico, and who really killed rappers Biggie Smalls and Tupac Shakur.
Obama then poked fun at Trump’s reality show, “The Apprentice,” and referred to how Trump, who owned hotels, casinos and golf courses, might renovate the White House.
When Obama was finished, Seth Meyers, the host of the dinner, made additional jokes at Trump’s expense.
“Donald Trump has been saying that he will run for president as a Republican – which is surprising, since I just assumed that he was running as a joke,” Meyers said.
Trump gets the last laugh
The New Yorker magazine writer Adam Gopnik remembered watching Trump as the jokes kept coming at his expense.
“Trump’s humiliation was as absolute, and as visible, as any I have ever seen: his head set in place, like a man on a pillory, he barely moved or altered his expression as wave after wave of laughter struck him,” Gopnik wrote. “There was not a trace of feigning good humor about him.”
Donald Trump and Melania Trump arrive for the White House correspondents’ dinner in Washington on April 30, 2011. AP Photo/Alex Brandon, File
Roger Stone, one of Trump’s top advisers, said Trump decided to run for president after he felt he had been publicly humiliated.
“I think that is the night he resolves to run for president,” Stone said in an interview with the PBS program “Frontline.” “I think that he is kind of motivated by it. ‘Maybe I’ll just run. Maybe I’ll show them all.’”
Trump, if Stone and other political observers are correct, sought the presidency to avenge that humiliation.
“I thought, ‘Oh, Barack Obama is starting something that I don’t know if he’ll be able to finish,’” said Omarosa Manigault, a former “Apprentice” contestant who became Trump’s director of African American outreach during his first term.
“Every critic, every detractor, will have to bow down to President Trump,” she said. “It is everyone who’s ever doubted Donald, whoever disagreed, whoever challenged him – it is the ultimate revenge to become the most powerful man in the universe.”
The notoriously thin-skinned Trump did not attend the White House correspondents’ dinner during his first presidency. He also did not attend the dinner during the first year of his second presidency.
Although Trump has never publicly acknowledged the importance of that event in 2011, a number of people have noted how pivotal it was, demonstrating how the putdown can be a powerful weapon in politics – even, perhaps, extending to tearing down the White House’s East Wing.
Chris Lamb does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
U.S. Sen. Amy Klobuchar of Minnesota speaks at an oversight hearing before the Senate Judiciary Committee on Oct. 7, 2025. AP Photo/Allison Robbert
Routine congressional oversight hearings usually don’t make headlines. Historically, these often low-key events have been the sorts of things you catch only on C-SPAN – procedural, polite and largely ignored outside the Beltway.
But their tone has shifted dramatically during the second Trump administration.
When Attorney General Pam Bondi appeared before the Senate Judiciary Committee on Oct. 7, 2025, what took place was a contentious, highly partisan, made-for-TV-and-social-media confrontation.
The hearing occurred on the heels of the indictment of former FBI Director James Comey, which many legal experts view as an example of a president targeting his political enemies. Bondi came ready to fight. She refused to answer many questions from Democrats, instead launching personal attacks against these members of the U.S. Senate.
From our perspective as political scientists who study the U.S. Congress, congressional oversight has played an important role in American democracy. Here’s a brief history.
Congressional oversight hearings help keep executive branch agencies accountable to the public.
Inquisitory powers
In simple terms, oversight is the ability of Congress to ensure that the laws it passes are faithfully executed. This generally means asking questions, demanding information, convening hearings and holding the executive branch accountable for its actions.
Oversight isn’t specifically mentioned in the Constitution. Article 1, Section 8, which lists the powers of Congress, includes the power “to make all laws which shall be necessary and proper,” without identifying an oversight role. Once laws are enacted, Article 2, Section 3, states that the president “shall take Care that the Laws be faithfully executed.”
However, the framers viewed congressional oversight as a key component of legislative authority. They wanted presidents to take Congress seriously and structured the Constitution to ensure that the executive would be accountable to the legislature. As James Madison urged in Federalist 51, the separate branches of government should have the power to keep each other from becoming too powerful. “Ambition must be made to counteract ambition,” Madison wrote.
At the Federal Convention in 1787 that produced the Constitution, Delegate George Mason noted that members of Congress possessed “inquisitory powers” and “must meet frequently to inspect the Conduct of public officials.” Even though this idea was never written down, it was a habit of self-government that early Congresses put into practice.
Sen. Sam Ervin, chair of the Senate Watergate Committee, announces on July 23, 1973, that the committee has decided to subpoena White House tapes and documents related to the Watergate burglary and cover-up. AP Photo
Early oversight hearings
Congressional oversight began almost as soon as the first Congress met. In 1790, Robert Morris, the superintendent of finances during the Continental Congress and a financier of the American Revolution, asked Congress to investigate his handling of the country’s finances and was exonerated of any wrongdoing.
During this period, congressional investigations were often referred to select committees – bodies created to perform special functions. These panels had the power to issue subpoenas and hold individuals in contempt. Since there was no official record of debates and proceedings, the public relied on newspaper accounts to learn about what had happened.
In March 1792, congressional oversight exposed businessman William Duer, who signed contracts with the War Department but failed to furnish the needed military supplies. This shortfall contributed to a stunning U.S. military defeat against a confederation of Native American tribes in the Northwest Territory.
Congress eventually removed the quartermaster general from his role for mismanaging the contracts. Duer was simultaneously involved in perhaps the first American economic bubble, which burst at the same time as Congress’ hearings. He ended up in a debtor’s prison, where he died in 1799.
Throughout the 19th century, Congress continued to quietly exercise this power. The work was often invisible to the public, but the issues were important. Hearings from December 1861 to May 1865 on the conduct of the U.S. Civil War produced a detailed record of the war, exposed military wrongdoing and condemned slavery. In 1871, the Senate created a select committee to investigate Ku Klux Klan violence during Reconstruction.
Investigating corruption and criminal acts
Congress started to use its oversight power more aggressively in the 1920s with the Senate Committee on Public Land and Surveys’ high-profile investigations into the Teapot Dome scandal.
Hearings revealed that Interior Secretary Albert Bacon Fall had secretly leased federal oil reserves in Wyoming to two private corporations and had received personal loans and gifts from the companies in return.
The investigation found clear evidence of corruption. Fall was indicted and became the first U.S. Cabinet member to be convicted of a felony.
The U.S. Supreme Court helped to shape the legal foundation of congressional oversight. In McGrain v. Daugherty, decided in 1927, the court held that congressional committees could issue subpoenas, force witnesses to testify and hold them in contempt if they failed to appear. Two years later, in Sinclair v. United States, the court ruled that witnesses who lied to Congress could be charged with perjury.
These cases granted the judicial branch’s sanction to what had long been an implied legislative power, cementing the constitutionality of congressional oversight.
Oversight highs and lows
The modern era of congressional oversight has produced some very important reforms – and some truly regrettable spectacles.
The most important example of bipartisan congressional oversight came in response to reporting by The Washington Post’s Carl Bernstein and Bob Woodward. The two journalists wrote about the 1972 burglary of the Democratic National Committee offices in Washington, D.C.’s Watergate complex and the subsequent cover-up efforts by the Nixon administration.
On Feb. 7, 1973, the U.S. Senate voted 77-0 to establish a Select Committee on Presidential Campaign Activities, which brought together Democrats and Republicans to investigate what came to be known as the “Watergate scandal.” The committee’s work spurred action in Congress to impeach President Richard Nixon, leading to Nixon’s resignation in 1974 and to the enactment of legal reforms to provide an institutional check on presidential power.
Another high point for congressional oversight came after the 9/11 terrorist attacks in 2001. Seeking to learn how the deadliest terrorist strike on American soil had occurred, Democratic Sen. Bob Graham and Republican Rep. Porter Goss, who chaired the Senate and House Intelligence committees, formed a joint committee to investigate intelligence failures before and after the attacks.
This inquiry produced several important recommendations that were ultimately adopted, including the creation of a director of national intelligence and a Department of Homeland Security, as well as better information sharing among law enforcement agencies.
After the Sept. 11, 2001, attacks on the World Trade Center in New York City, shown here, and targets in Washington, D.C., a congressional committee investigated intelligence failures that had impeded detection of the terrorist plot. Universal History Archive/UIG via Getty Images
Congress’ oversight can extend beyond the executive branch when the actions of private actors raise questions about existing laws or spur the need for new ones. As examples, investigations into medical device safety and Enron’s 2001 collapse examined malfeasance in the private sphere that existing regulations failed to prevent.
However, the power to expose corruption can also be used as a tool to score partisan points and generate outrage, rather than holding the executive branch accountable for actual malfeasance. Notably, in the 1950s, Wisconsin Sen. Joseph McCarthy turned oversight into inquisition and used the power of media to amplify his accusations of communist influence within the federal government.
Democracy needs oversight
Congressional oversight has strengthened the democratic system at many points. But hearings like Bondi’s recent session before the Senate Judiciary Committee aren’t the first, and likely won’t be the last, to substitute sound bites for substance.
As we see it, the problem with allowing oversight to become political theater is that it distracts Congress from quieter and more meaningful oversight work. Slow, procedural work isn’t likely to go viral, but it helps keep government accountable. The task of a deliberate legislative body is to reconcile those very different impulses.
The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.
Source: The Conversation – UK – By Andrea Wright, Senior Lecturer in Teaching and Learning Development, Edge Hill University
In March 1955, an 18-year-old Jim Henson built a puppet from his mother’s old coat, a pair of blue jeans and some ping pong balls. The lizard-like creation first appeared on Afternoon, a television series on Washington D.C.’s WRC-TV, but became a regular on the five-minute Sam and Friends puppet sketch comedy show from May 1955. Over 70 years, the creature evolved into Kermit. The bright green frog is now a cultural icon.
To mark 70 years of The Jim Henson Company, the company has curated an auction of official memorabilia, including puppets, props, costumes and artwork. In a specially-recorded promotional video, Brian Henson, Jim’s son, provides a useful reminder that his father’s legacy is far greater than The Muppets.
Indeed, Henson made a significant contribution to the screen fairytale, a genre all too often dominated by Disney. To encourage fans and viewers to think beyond The Muppet Show and Disney, I offer a reappraisal of his career in my book The Fairy Tales of Jim Henson: Keeping the Best Place by the Fire.
By far the biggest section of the auction is made up of items created for the productions and publicity of The Dark Crystal (1982) and the Netflix revival series The Dark Crystal: Age of Resistance (2019). The original fantasy evolved from an idea Henson had to create a story around an anthropomorphised reptilian race, which eventually became the formidable Skeksis.
The trailer for The Dark Crystal.
His collaboration with the British artist Brian Froud led to the evolution of the intricate world of The Dark Crystal. The film follows Jen (voiced by Stephen Garlick), a delicate, fey-like creature from the nearly-extinct Gelfling race. Jen embarks on a quest to save the planet Thra by healing the Dark Crystal. He must complete his mission before the “great conjunction”, an event that would give the evil Skeksis power over the fragile world forever.
This ambitious endeavour was not the first time that Henson had used a fairytale-inspired story or aesthetic. As early as 1958, following a trip to Europe, he began to develop a version of Hansel and Gretel. Although it remained unfinished, fairytales became an established strand in Henson’s work.
This included two unaired pilots called The Tales of the Tinkerdee (1962) and The Land of Tinkerdee (1964), as well as the three television specials that make up Tales from Muppetland (1969-72). The latter are playful, gentle parodies and a Muppetisation of the well-known stories Cinderella, The Frog Prince and The Bremen Town Musicians.
Fairytales even inspired two of Henson’s mid-1960s commercials for The Compax Corporation’s Pak-Nit and Pak-Nit RX – preshrunk fabrics used to make leisurewear. The ads were titled Shrinkel and Stretchel and Rumple Wrinkle Shrinkel Stretchelstiltzkin. Fairytale themes also appeared from time to time in segments of Sesame Street (1969-present) and The Muppet Show (1976–81).
Henson’s film Labyrinth (1986) is a beguiling blend of well-known coming-of-age fairy stories, most overtly Alice’s Adventures in Wonderland (1865) and The Wonderful Wizard of Oz (1900). These references are combined with original and innovative puppetry and design, and, of course, David Bowie as the charismatic Goblin King.
The trailer for Labyrinth.
One of Henson’s final projects was the imaginative and technically inventive television series Jim Henson’s The Storyteller (1987-89). Inspired by her folklore studies at Harvard University, Lisa Henson encouraged her father to develop a show based on the rich European folk tale tradition – importantly, one that avoided the best-known tales in favour of the more unusual and challenging.
Fairytales are an important – and often overlooked – part of Henson’s legacy, from the final productions made during his lifetime to The Jim Henson Company’s later output (for example, Jim Henson’s Jack and the Beanstalk: The Real Story in 2001 and The Dark Crystal: Age of Resistance). Fans are also consistently teased with rumours of a Labyrinth sequel or reboot. Most recently, Robert Eggers is reported to be directing.
Henson should be considered one of the foremost creators of screen fairytales of the 20th century. As his fans celebrate the 70th anniversary of his creations, it’s time for the world to rediscover his magical body of work, beyond the much-beloved Muppets.
This article features references to books that have been included for editorial reasons, and may contain links to bookshop.org. If you click on one of the links and go on to buy something from bookshop.org The Conversation UK may earn a commission.
Andrea Wright does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Frankenstein’s creature is coming back to life – again. As Guillermo del Toro’s new adaptation of Mary Shelley’s gothic masterpiece airs on Netflix, we provide an anatomist’s perspective of her tale of reanimation. Could an assembled body ever breathe, bleed or think?
When Shelley wrote Frankenstein in 1818, anatomy was a science on the edge of revelation and respectability. Public dissection theatres drew crowds, body snatchers supplied medical schools with illicit corpses and electricity promised new insights into the spark of life.
Shelley’s novel captured this moment perfectly. Victor Frankenstein’s creation was inspired by real debates: Luigi Galvani’s experiments on frog legs twitching under electric charge, and Giovanni Aldini’s demonstrations making executed criminals grimace with applied current. To early 19th-century audiences, life might indeed have seemed a matter of anatomy plus electricity.
The first problem for any modern Frankenstein is practical: how to build a body. In Shelley’s novel, Victor “collected bones from charnel houses” and “disturbed, with profane fingers, the tremendous secrets of the human frame”, selecting fragments of cadavers “with care” for their proportion and strength.
From an anatomical perspective, this is where the experiment fails before it begins. Once removed from the body, tissues rapidly deteriorate: muscle fibres lose tone, vessels collapse and cells deprived of oxygen enter necrosis within minutes. Even refrigeration cannot preserve viability for transplantation beyond a few hours.
To reattach limbs or organs would demand surgical anastomosis – precise reconnection of arteries, veins and nerves using microsutures finer than a human hair. The notion that one could sew together entire bodies with “instruments of life” and restore circulation across so many junctions defies both physiology and surgical practice.
Shelley’s description of construction is vague; we estimate that the limbs alone would require over 200 surgical connections. Each piece of tissue would have to be matched to avoid immune rejection, and everything would need to be kept sterile and supplied with blood to stop the tissue from dying.
The electrical illusion
Let’s assume the parts settle into place. Could electricity reanimate the body? Galvani’s twitching frogs misled many into believing so. Electricity stimulates nerve membranes, triggering existing cells to fire – a fleeting simulation of life, not its restoration.
Defibrillators work on this principle: a well-timed shock can reset a fibrillating heart because the organ is already alive, its tissues still capable of conducting signals. Once cells die, their membranes break down and the body’s internal chemistry collapses. No current, however strong, can restore that balance.
The thinking problem
Even if a monster could be made to move, could it think? The brain is our most hungry organ, demanding constant oxygen-rich blood and glucose for energy. A living brain’s vital functions only work under tightly-controlled body temperature and depend on the circulation of fluids – not just blood but cerebrospinal fluid (CSF), too, pumped under appropriate pressure, delivering oxygen and carrying away wastes.
Brain tissue can stay alive for only six to eight hours once it is removed from the body. To keep it going for that long, it has to be cooled on ice or placed in a special oxygen-rich solution. During this time, the brain cells can still work for a while – they can send signals and release chemicals.
Cooling the brain is already used in medicine, for example, after a stroke or in premature babies, to protect the brain and reduce damage. So, in theory, cooling a donor brain before a transplant could help it survive longer.
If we can transplant faces, hearts and kidneys, why not brains? In theory, a rapidly transplanted brain could have its vessels connected to a new body. But the severed spinal cord would leave the body paralysed, without sensation, requiring artificial ventilation.
With circulation restored, pulsing CSF flow and an intact brainstem, arousal and wakefulness might be possible. But without sensory input, could such a being have complete consciousness? The brain is the organ of every memory, thought and action we make, so a body receiving a donor brain would find itself programmed with another mind’s personality and legacy of memories. Could new memories form? Yes, but only those born from a body severely limited by the absence of movement or sensation.
Controversial surgeon Sergio Canavero has argued human head transplants may enable “extreme rejuvenation”. But beyond the ethical alarms, this would require reconnecting all peripheral nerves, not just joining the spinal cord – a feat far beyond current capability.
Life support, not resurrection
Modern medicine can replace, repair or sustain many parts once considered vital. We can transplant organs, circulate blood through machines and ventilate lungs indefinitely. But these are acts of maintenance, not creation.
In intensive care units, the boundaries between life and death are defined not by the beating heart, but by brain activity. Once that ceases irreversibly, even the most elaborate support systems can only preserve the appearance of life.
Shelley subtitled her novel The Modern Prometheus for a reason. It is not just a story about science’s ambition, but about its responsibility. Frankenstein’s failure lies not in his anatomical ignorance but in his moral blindness: he creates life without understanding what makes it human.
Two centuries later, we still wrestle with similar questions. Advances in regenerative medicine, neural organoids and synthetic biology push at the boundaries of what life means, but they also remind us that vitality cannot be reduced to mechanism alone. Anatomy shows us how the body works; it cannot tell us why life matters.
The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.