It is possible to learn better at university with artificial intelligence

Source: The Conversation – (in Spanish) – By Natalia Lara Nieto-Márquez, Professor and Researcher in Educational Technologies, Universidad Camilo José Cela

Luisa, an undergraduate student, uses ChatGPT as a personal tutor. She asks it for additional explanations of complex concepts ("What prior knowledge do I need to master, and why is it relevant?" or "How does this new part of the syllabus relate to what came before?") and for practical examples that help her go deeper into her coursework and prepare for exams (for instance, she asks it to point out the most common conceptual errors so she can test herself while studying, or to suggest how an assignment could be improved and enriched before she hands it in).

Martín, a student on the same degree course, uses AI to solve his class exercises for him. In the short term his assignments are handed in faster, but when the exam arrives he discovers that his understanding is superficial, because he delegated all the effort to the tool.

Academic integrity and artificial intelligence

Artificial intelligence has gone from being an unknown to being a widely accessible resource that is transforming education. Ninety per cent of university students already use tools such as ChatGPT in their studies.

Integrating it into the curriculum and the classroom while preserving academic integrity requires both teachers and students to review and adapt their practices.

This technology can democratise access to education through resources adapted to each student, helping them overcome barriers and improve their results. But for that to actually happen, it must be used to enhance learning, not as a substitute for learning itself.

We need to foster skills such as metacognition (the ability to reflect on how we learn) and self-regulation (our capacity to manage time and effort and to direct our own learning) in order to teach students to use artificial intelligence to improve their performance without compromising academic integrity or the acquisition of fundamental skills.

Metacognition and self-regulated learning

In today's academic context, teachers must not only teach content: helping students learn how to learn is fundamental. That means helping them reflect on their own learning process and become able to use artificial intelligence like Luisa, and not like Martín.

As noted above, artificial intelligence can strengthen self-regulation and metacognition, so that students reflect on their own learning and adjust their strategies.

There are apps, intelligent tutoring systems and general-purpose tools such as ChatGPT that can analyse students' study patterns and offer personalised recommendations. Students can receive suggestions on how to plan their study time, which concepts to reinforce or which types of exercises they need to practise more, based on their pace and their errors.

Read more:
What does it mean to self-regulate? The key to learning how to learn

For example, Luisa has found that she learns best with short daily sessions of guided practice and quick quizzes that she generates with AI. This way of studying, together with the immediate feedback, helps her consolidate concepts little by little. Her classmate Marta, however, has discovered that what works best for her is creating concept maps and practising what she has been learning through conversations with the AI. By laying it out visually and rephrasing it aloud, she consolidates the information and spots gaps in her understanding.

By conversing with these virtual tutors, always under the guidance of a teacher, a student can review the concepts they need to reinforce or deepen their knowledge of the subject. The teacher can support this by prompting metacognitive questions such as "What can you interpret from this information?" or "How could you use this information in a new situation?". By becoming aware of how they learn, students can adjust the way they study or try new learning tactics.

Designing and running classroom activities around these processes fosters both self-assessment and strategic adjustment of learning: each learner can identify their strengths and weaknesses.

A way to go deeper, not a shortcut

The aim is for students to learn to use artificial intelligence to deepen their learning, not as a shortcut to avoid academic effort. Artificial intelligence makes information and learning resources easier to access, and teachers must act as mediators between the technology and the student.

That means teaching students to critically evaluate the results they get from these tools, to question the reliability of sources and to recognise the limitations of these technologies.

Read more:
Practical workshops to teach university students how to use artificial intelligence

Designing classroom activities

How can this be achieved in class? One strategy is for the teacher to design tasks in which using AI requires reflection. For example, students can be asked to use an AI tool (such as Perplexity) to research a specific topic from the course, and then to present their findings in class, analysing how reliable the results seemed and how they verified the information. In this way, AI becomes the practice ground for deep learning and critical thinking: a means, but not the end of the journey.

Read more:
Artificial intelligence at university: students are calling for new forms of assessment

Another practical example teachers can introduce is debate exercises in which each student brings to class an AI-generated answer to a question from the course, and the group analyses those answers together, correcting errors and comparing approaches. This teaches students to question and improve the output with their own judgement, strengthening their critical thinking.

These activities not only guide students towards a more critical use of the technology; they also underline the importance of the teacher as a guide.

Prospects for the academic future

Artificial intelligence can help us transform the learning experience at university and the way we prepare for a constantly changing world, promoting autonomy and self-assessment.

In a world where technology advances at great speed, those who master artificial intelligence from a critical and ethical perspective gain the advantage of a continuous-learning mindset, indispensable in the digital age.

The Conversation

Natalia Lara Nieto-Márquez receives funding from the 10th Research Call of the Universidad Camilo José Cela. This work forms part of the research project EDUSMART-IA (Estrategias de Aprendizaje Inteligente para un Futuro Académico Sostenible – Intelligent Learning Strategies for a Sustainable Academic Future).

ref. It is possible to learn better at university with artificial intelligence – https://theconversation.com/se-puede-aprender-mejor-en-la-universidad-con-la-inteligencia-artificial-242134

To protect coral reefs, we must also protect the people who depend on them

Source: The Conversation – Canada – By Pedro C. González Espinosa, Postdoctoral Research Fellow, The School of Resource and Environmental Management, Simon Fraser University

Coral reefs are vital ecosystems that sustain millions of people, yet they face a growing crisis. Rising ocean temperatures are causing coral bleaching, a process where heat disrupts the relationship between corals and the microalgae living inside them. If the stress continues, the corals may die.

Since the 1980s, bleaching events have increased significantly, posing a major threat to reefs and the coastal communities that rely on them for food, income and protection.

Prolonged coral bleaching events, caused by environmental stress, can cause coral reefs to die. (Danielle.ihde/Wikimedia)

Scientists rely on data-monitoring tools to predict where and when bleaching is most likely to occur. These tools help inform conservation decisions made by the reef managers in charge of preserving the reefs, like temporarily pausing tourism or fishing to allow corals to recover.

But important questions are often overlooked in the process: Are these decisions fair? And who bears the cost of protecting coral reefs?

Managing reefs

Predicting bleaching events is crucial for managing reefs. The U.S. National Oceanic and Atmospheric Administration’s Coral Reef Watch uses satellite data to issue real-time bleaching alerts. These alerts guide managers to act before reefs reach dangerous thresholds.

However, the science isn’t perfect. Our research shows that about one-third of bleaching alerts worldwide are false alarms or missed events. If fishing or tourism sites are closed based on a false alarm, it can cause unnecessary economic hardship to local communities. On the other hand, failing to act when bleaching happens risks long-term ecological damage.

Not all reefs are affected equally. The challenges are especially severe in developing countries, where coastal communities depend heavily on reefs for their livelihoods. Many coastal communities do not have the money, government support or backup options they need to protect reefs or cope with the damage.

As a result, coastal communities in developing countries bear the greatest ecological and economic risks. This highlights the need for a rethink of how reef-protection strategies are designed and implemented.

Equity matters

Reef management must be fair, not just to the reefs but also to the people who depend on them. This is where equity comes in. Equity means not only sharing the benefits of healthy reefs, but also ensuring that the costs, such as fishing bans or tourism closures, do not disproportionately fall on those least able to handle them.

There are three key principles that help bring equity to the heart of coral-reef management and the fair sharing of reef resources:

  1. Recognizing equity: Different groups relate to the reef in different ways. For some, it is a source of food; for others, it has cultural or spiritual meaning. Understanding and respecting these different connections — whether scientific, economic or traditional — is essential.

  2. Fair decision-making: Managing the reef should be a collective undertaking. It is important that those who depend on it, like fishing communities, tourism operators and Indigenous groups, have a real say in how the reef is used, and not just more powerful or rich interest groups, like commercial fishers.

  3. Fair sharing of benefits and costs: The benefits of healthy reefs, like fish, tourism income or coastal protection, should be distributed fairly. Likewise, the costs, such as fishing bans or tourism restrictions, should not fall unfairly on those who can least afford it.

These ideas may sound like common sense, but they are often missing in practice. In developing countries, decisions made in the interest of reef conservation can unintentionally harm local communities.

Sustainable and local solutions

Without alternative sources of income or food, restricting fishing or access to reefs can worsen poverty, exacerbate gender inequity or push unsustainable practices to other areas.

In the Solomon Islands, for example, coral reefs are crucial for both food and economic well-being.

In some communities, the food and materials people harvest from reefs, like fish, shells and corals, are worth more than what they earn from other sources. But heavy reliance on reef resources, especially for cash income, has led to over-extraction in some areas, putting both the reefs and local livelihoods at risk.

A community-led planning project in the small island nation of Tuvalu has fostered support for low-impact tourism that balances conservation with livelihoods. Villages identified key sites to protect and developed visitor guidelines to support tourism in a socially responsible and environmentally sustainable manner, balancing reef conservation with local livelihoods.

These examples show that conservation solutions must be co-designed, flexible and tailored to each context. Decisions to close areas or create protected zones should consult with and include local communities.

Reuters reports on the world’s largest coral, discovered in the Solomon Islands.

Toward a just future

Strategies to protect coral reefs need to evolve to include impacted communities. This means reshaping decision-making processes, who is involved and how risks and benefits are shared. It also means addressing global imbalances in conservation leadership.

Many reef initiatives are still led by institutions from wealthy nations, even though the reefs most at risk are in developing countries. In many cases, local communities are invited to participate, but participation alone may not guarantee equity. True equity is about shifting power in leadership and making space for local communities and institutions, providing them with real authority to manage their own resources.

Integrating equity into every stage of coral-bleaching management — including warning systems, impact assessments, stress reduction and policy decisions — not only boosts conservation outcomes, but also ensures that efforts to save the reefs do not come at the expense of the people most dependent on them.

The Conversation

Pedro C. González Espinosa receives funding from the Nippon Foundation Ocean Nexus, School of Resource and Environmental Management (REM), Simon Fraser University (SFU).

ref. To protect coral reefs, we must also protect the people who depend on them – https://theconversation.com/to-protect-coral-reefs-we-must-also-protect-the-people-who-depend-on-them-252546

All women — not just mothers — could benefit from more workplace flexibility

Source: The Conversation – Canada – By Anja Krstic, Assistant Professor of Human Resource Management, York University, Canada

Despite progress toward gender equity, many women continue to take on the majority of unpaid labour within their households, including housework and child care.

On average, women spend twice as much time as men per week on housework (12.6 hours compared to 5.7) and child care (12 hours compared to 6.7).

Unpaid labour also includes cognitive labour — the mental work of anticipating household needs, identifying and weighing options to fulfil them and monitoring whether those needs have been met.

Cognitive labour underpins many physical household and child-care tasks. For example, cooking or shopping for the household requires planning meals around preferences, anticipating various needs, finding alternatives if needed and keeping track of satisfaction with meals and products.

Cognitive labour is often called the “third shift” because it’s largely mental and invisible in nature. This work is often done in the background and is dispersed throughout the day, and women in heterosexual couples tend to shoulder most of it.

As experts in organizational behaviour, we recently conducted a study that found this form of invisible labour also significantly impacts women’s workplace experiences and career outcomes, which ultimately undermines gender equity.

The hidden cost of cognitive labour

For our study, we surveyed 263 employed women and men in heterosexual relationships with employed partners across the United States and Canada. Over seven weeks from April to May 2020, participants reported weekly on the division of cognitive, household, paid and child care labour between them and their partner. They also shared their level of emotional exhaustion, turnover intentions and career resilience.

It’s worth noting that our sample was predominantly white, highly educated and included only those in heterosexual relationships, which may limit how widely these findings apply.

Our results reveal that women take on more cognitive labour than men, even when accounting for the distribution of household and paid labour. This imbalance was linked to greater emotional exhaustion, which, in turn, was associated with a higher likelihood of wanting to leave one’s job and a reduced ability to cope with workplace changes.

In addition, nearly half the participants had at least one child under the age of 18 living with them. This is notable because school and daycare closures during the early days of the COVID-19 pandemic significantly increased child care demands, which women took the brunt of.

We found mothers shouldered a disproportionate amount of child care compared to fathers. Child care — not cognitive labour — was the key predictor of emotional exhaustion, which again resulted in a reduced capacity to cope with their work environment.

In other words, women experienced higher amounts of emotional exhaustion and undermined work outcomes, but the driver varied. For women without children, it was an unequal division of cognitive labour. For mothers, it was unequal child-care responsibilities.

Unpaid labour doesn’t just affect mothers

Much of the research and discourse on unpaid labour tends to conflate it with child care. Yet our findings highlight that unpaid labour affects the careers of both women with and without children.

Work-life balance research and policies often focus on mothers, overlooking the fact that women without children also disproportionately experience burdens at home that can impact their careers.

Our work also contributes to a growing body of research on the work experiences of women without children, who are often rendered invisible in literature. Past research has found that mothers are more likely than their child-free peers to be granted access to flexible work arrangements. Such differences were not found for men with and without children.

This lack of focus reinforces traditional gender stereotypes of women that equate womanhood with motherhood. Our work takes initial steps to address this gap by shedding light on the experiences and challenges that women without children face in managing work and home duties.

How organizations can support all women

Our findings show that women are overburdened by their domestic responsibilities, which can harm their career outcomes and undermine gender equity. But this is not just a personal issue, but an organizational one as well. Organizations have an important role to play in supporting and retaining women in the workplace. Here are several ways they can help.

1. Offer flexible work arrangements.

Organizations can promote a more equitable division of labour within households by offering work arrangements like flexible hours and remote work. Research has shown that such arrangements encourage men to increase their participation in housework and child care.

2. Design flexible work arrangements for all employees, not just parents.

Flexible work arrangements should not be designed with only parents in mind. Women without children also benefit from flexible work arrangements, as they can lessen the strain and resulting career outcomes of cognitive labour. Offering these arrangements to men without children may also encourage them to take on a greater proportion of cognitive labour in their household.

3. Recognize that flexible work arrangements may inadvertently and unfairly benefit men.

Given that women in general take on a greater share of unpaid labour than men, they are more likely to use flexible work arrangements. In contrast, men may use the same flexibility to focus on career advancement. Research has shown that men are more likely to use parental leave to take on more work, develop human capital or build new skills. Organizations should ensure flexible work policies are used as intended and do not inadvertently advantage men.

4. Normalize the use of flexible work arrangements.

It is not enough for organizations to only offer flexible work arrangements; they must also normalize and encourage their use. Women tend to use them more often because some men fear being viewed negatively for using them. Managers should lessen such fears by communicating that these arrangements won’t lead to penalties, and they should act as role models by using such arrangements themselves.

To better support the challenges that women are facing and promote gender equity, structural changes both within the home and at work are necessary, and organizations play an important role in advancing these changes.

Christianne Varty, researcher and business strategist, co-authored this article.

The Conversation

Anja Krstic’s research has received funding from the Social Sciences and Humanities Research Council of Canada (SSHRC).

Ivona Hideg’s research has received funding from the Social Sciences and Humanities Research Council of Canada (SSHRC).

Janice Yue-Yan Lam’s research has received funding from the Social Sciences and Humanities Research Council of Canada (SSHRC), along with Ontario Graduate Scholarships.

Winny Shen’s research has received funding from the Social Sciences and Humanities Research Council of Canada (SSHRC).

ref. All women — not just mothers — could benefit from more workplace flexibility – https://theconversation.com/all-women-not-just-mothers-could-benefit-from-more-workplace-flexibility-260889

How the UK could reform the European convention on human rights

Source: The Conversation – UK – By Joelle Grogan, Senior Visiting Research Fellow, UCD Sutherland School of Law, University College Dublin

Whether the UK should leave the European Convention on Human Rights (ECHR) has been a debate in UK politics for years. Conservatives have long accused the convention of interfering with government policy on migration and criminal justice, and have debated repealing the Human Rights Act 1998 (which enshrines the convention in UK law).

Stories of foreign criminal deportations stopped over a child’s taste for chicken nuggets, or having a pet cat, have fuelled the debate. These stories (although debunked) give the impression that human rights law undermines border control on the most trivial grounds.

Suella Braverman, who as Conservative home secretary was one of the most vocal advocates for leaving, has laid out a 56-page plan to do so. Current Conservative leader Kemi Badenoch has commissioned a review into whether the UK should leave the ECHR and other international legal agreements.

But there are alternatives to leaving entirely. Labour justice secretary Shabana Mahmood has signalled plans for reform with a focus on foreign criminal deportations. On a visit to Strasbourg in June, Mahmood suggested that there is a perception that “the law too often protects those who break the rules, rather than those who follow them”.

Other signatories to the convention are concerned too – though none have called to leave it. In May 2025, nine countries led by Italy published an open letter calling on the European Court of Human Rights to “restore the right balance” between migration and the ECHR. They want states to have “more freedom” to tackle irregular migration and expel foreign national criminals.

How does the ECHR work?

It’s important to note that the ECHR contains no right to asylum, nor a right to enter or remain in a country of which you are not a national. Deporting someone back to their country or to a safe third country does not violate the ECHR.

However, in exceptional cases, a person can challenge their removal on human rights grounds under the convention in UK courts or – very rarely – in Strasbourg. These are the cases that the UK is concerned about.

There are, generally speaking, two routes to this. A person may challenge their removal under Article 3 of the convention (prohibition of torture and other severe ill-treatment) if, for example, there is a serious risk that they may be tortured in the country to which they would be sent.

Or they can do so under Article 8 (the right to respect for private and family life). For example, if they have children who are entirely dependent on them and unable to leave with them.

Article 3 is an absolute right: nothing can justify the use of torture or allowing a person to be tortured. Article 8 is a qualified – not absolute – right. It can be limited where this is lawful, proportionate and necessary to protect the wider public interest. Deporting a foreign national who has committed a criminal offence could be such a case.

If a person believes their rights have been violated through being deported, they can make an application to the European Court of Human Rights, but only if they have exhausted every domestic route in their national courts. This is not an appeal, and the court cannot overturn a domestic judgment or invalidate national law. However, a negative judgment legally obliges the member state to stop the violation and ensure it does not happen again.

Judgments by the European Court of Human Rights against the UK are rare. Since 1980, there have been only four cases where the court ruled that the UK violated the right to family life in a deportation case.

Within the UK, while information on how many foreign national criminal deportations have been stopped on human rights grounds is scarce, the most recent available data suggests that only 2.5% of Article 8 appeals against deportation (or 645 cases over six years) were successful in UK courts. Some of these could have subsequently been overturned, but that information is not publicly available.

How could it be reformed?

As governments throughout Europe look for ways to manage migration, some states are looking at reforming the ECHR on a Europe-wide level.

The text of the convention can be amended with the unanimous consent of all 46 members of the Council of Europe. This would likely take years to negotiate and come into force.

Alternatively, member states can issue a joint declaration to try to influence how the court interprets the convention. This might, for example, call for greater deference to national decisions related to migration and the right to family life.

While it’s certain that many states have concerns regarding migration, they might not necessarily have the same view on what to do about it. Denmark led an effort on ECHR reform in 2018. But its initial draft declaration, which emphasised the primacy of states and the secondary role of the court, was roundly criticised by other member states, and ultimately a much watered-down version was passed.

Reform within the UK

Current immigration rules set by parliament establish the conditions for when Article 8 can be applied.

These rules allow courts to consider how long the foreign offender facing deportation has lived lawfully in the UK, along with how socially and culturally integrated they are, and whether there would be “very significant obstacles” for them to integrate into another country. The rules also allow an Article 8 exception where deportation would be “unduly harsh” for any dependent children.

For serious crimes, foreign offenders “must show very compelling circumstances over and above” these conditions.

The Ministry of Justice has indicated that legislation will be introduced domestically to clarify Article 8 rules and to “strengthen the public interest test” so that fewer cases are treated as “exceptional”.

The government could legislate to require the courts to heavily weight the risk of reoffending, and the threat posed to public safety by the crimes committed, in their decision. These are already implicit when courts balance the rights of the individual with the public interest, and so likely to influence cases only at the margins, but could serve the delicate politics at play without breaching international obligations.

Alternatively, parliament could legislate – as advocated by the Conservatives – to exclude all deportation decisions from the scope of the Human Rights Act. This would abandon the principle that human rights are for everyone, and in many cases, it would allow people to be sent back to conflict zones or unstable countries. Doing so would be tantamount to a departure not just from the ECHR, but from the UK’s commitment to human rights and the rule of law, risking serious political and legal consequences both domestically and to the UK’s international standing. Even then, as former home secretary James Cleverly points out, it would not be a “silver bullet” to removing the obstacles to deportations.

There are no reforms to the ECHR that would “fix” the challenges of irregular migration, the causes of which are largely unrelated to human rights guarantees.

What can be fixed, however, is the lack of accurate information about the extent to which the convention limits migration policy – particularly foreign criminal deportations. For this, a review of the application of Article 8 is welcome. Without clarification of where the ECHR fits within the wider pattern of immigration, we’re left with tall tales about cats and chicken nuggets swaying migration policy.

The Conversation

Joelle Grogan does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How the UK could reform the European convention on human rights – https://theconversation.com/how-the-uk-could-reform-the-european-convention-on-human-rights-259466

BrewDog’s ‘Equity for Punks’ fuelled its rapid rise – but may have contributed to its struggles

Source: The Conversation – UK – By Ross Brown, Professor in Entrepreneurship and Small Business Finance, University of St Andrews

Craft brewer and pub chain BrewDog recently closed some of its pubs in a push to cut operating costs. Given it is partly owned by private equity firm TSG Consumer Partners, the loss-making firm is likely to face further organisational upheaval. After all, private equity firms generally specialise in cutting costs and selling assets.

This downsizing is indicative of the widespread demise of the on-trade beer market (that is, venues that sell beer for consumption on site). The sector is seeing six pubs close down in the UK each week.

It is also testament to the importance of a good finance mix and how this affects a firm’s evolution. Throughout BrewDog’s turbulent history the firm has rarely been out of the headlines, beginning when it launched its in-house equity crowdfunding model.

Labelled Equity for Punks, the scheme enabled non-professional investors to obtain small amounts of equity (that is, shares in BrewDog) in return for relatively small levels of investment (approximately £500). The firm says on its website that the scheme offered beer enthusiasts the chance to “own a slice of the brewery” and offered them “pretty awesome perks” including discounted beer.

From its launch in 2009 until the scheme closed in 2021, Equity for Punks raised £75 million and attracted more than 200,000 small-scale investors. This funding model had major upsides for the firm – generating tremendous growth and expansion over the past 15 years. This vast investment enabled BrewDog to open more than 100 bars and restaurants around the world, employing 3,000 staff.

But how does this funding model work – and who benefits?

First, it enables companies such as BrewDog to access substantial levels of funding from non-professional investors to grow the firm quickly. Second, it cements strong brand loyalty in its investor base. In return for relatively small levels of funding, individual investors obtained promotional benefits – access to new products and company events such as annual shareholder meetings.

Equity crowdfunding models like this are often pursued by growth-orientated, consumer-focused firms that want to expand very quickly. By contrast, most small firms favour more modest levels of growth that are more sustainable in the longer term.

The vast majority of small firms rely on debt finance from banks. But a minority of high-tech firms seek investment from professional investors – business angels (wealthy individuals using their own money) or venture capital (or VC – usually provided by an investment firm). For high-tech firms that want to scale up rapidly, sizeable chunks of VC (£10 million-£40 million) are often the most likely funding route.

The Equity for Punks crowdfunding initiative effectively enabled BrewDog to act like a firm-specific, in-house stock market for small-scale investors. But while some of these investors may have been happy just to support a business they believed in, many will have had little knowledge or experience of equity investment and the risks associated with it.

In essence, this generated easy access to finance for BrewDog, with few strings attached. While venture capitalists and angel investors take an active role in the firms they fund, the equity crowdfunding model offers little active participation for these small-scale investors.

Cautionary tale

As such, the benefits for these investors are less evident. Due to the structure of the subsequent fundraising campaigns, the terms and conditions for investors became less favourable and diluted their original equity stakes in the firm.

Although these small-scale investors still own almost one-third of BrewDog, due to the private nature of the firm the shares cannot easily be traded and they derive very little benefit from their investments. This is especially true while the firm is not making profits.

Unless the firm is acquired, creating demand for the shares, there is little opportunity for the equity punks to realise the value of their original investments in BrewDog. In contrast, under the traditional model of equity investment, VCs and angels would push for strategic measures such as a trade sale of the firm to generate a return on their investment.

The experience of BrewDog is a cautionary one for small-scale equity investors. While hugely beneficial for the recipients of the investment, individual investors might lack knowledge about the true value of their investments.

This BrewDog has had its day. Billed as the flagship bar, in Gallowgate, Aberdeen, it has now closed its doors. (Diana Rebenciuc/Shutterstock)

It is not just BrewDog that has provided small-scale equity investors with little return. In the UK, the main equity crowdfunding platforms have raised substantial capital for young businesses which has produced little return for investors.

Platforms like Crowdcube continue to expand rapidly and raise considerable sums for growth-orientated firms such as BrewDog. However, the benefits for investors are often illusory due to a lack of trade sales known as “exits”, which allow investors to sell their stake.

These platforms are of course legitimate means of raising funds and are regulated by the Financial Conduct Authority.

Some academic research suggests, however, that a lack of due diligence on the part of the platforms can lead to firms with limited track records gaining substantial sums of investment. This can open up the potential for fraudulent behaviour, which economists call the risk of moral hazard.

Investors are not a homogeneous group and have vastly different levels of knowledge surrounding the risks associated with equity investments.

The BrewDog story has become a ubiquitous case study among business school academics. Rapid access to vast sums of capital allowed the firm to grow at breakneck speed, but with little in the way of stakeholder guidance, supervision and stewardship from investors.

If BrewDog had undertaken more sustainable growth using conventional sources of finance, it’s possible that the firm would be in better shape than it is now. While growth is a policy mantra, the “rollercoaster” nature of rapid growth can entail considerable woes for the entrepreneurs and firms involved.

In a nutshell, small-scale investors were left exposed, with little in the way of concrete returns. For many of them, their beer dreams will have fallen flat. But nonetheless, the growth of equity crowdfunding in recent years has been huge. As such, there’s a case to be made for greater investor protection in this arena.

BrewDog and Crowdcube were approached about the claims made in this article but declined to comment.

The Conversation

Professor Ross Brown receives funding from the ESRC under grant number ES/W010259/1 for the project ” Understanding how constraints on access to finance and under-investment impact on productivity growth in smaller firms”.

ref. BrewDog’s ‘Equity for Punks’ fuelled its rapid rise – but may have contributed to its struggles – https://theconversation.com/brewdogs-equity-for-punks-fuelled-its-rapid-rise-but-may-have-contributed-to-its-struggles-261909

How the internet and its bots are sabotaging scientific research

Source: The Conversation – UK – By Mark Forshaw, Professor of Health Psychology, Edge Hill University

There was a time, just a couple of decades ago, when researchers in psychology and health always had to engage with people face-to-face or using the telephone. The worst case scenario was sending questionnaire packs out to postal addresses and waiting for handwritten replies.

So we either literally met our participants, or we had multiple corroborating points of evidence that indicated we were dealing with a real person who was, therefore, likely to be telling us the truth about themselves.

Since then, technology has done what it always does – creating opportunities for us to cut costs, save time and access wider pools of participants on the internet. But what most people have failed to fully realise is that internet research also brings risks of data corruption and impersonation, sometimes from people deliberately aiming to put research projects in jeopardy.

What enthused scientists most about internet research was the new capability to reach people we might not normally be able to involve in research. For example, as more people could afford to go online, poorer people became able to take part, as did those from rural communities who might be many hours and multiple forms of transport away from our laboratories.

Technology then leapt ahead, in a very short period of time. The democratisation of the internet opened it up to yet more and more people, and artificial intelligence grew in pervasiveness and technical capacity. So, where are we now?

As members of an international interest group looking at fraud in research (Fraud Analysis in Internet Research, or Fair), we’ve realised that it is now harder than ever to identify if someone is real. There are companies that scientists can pay to provide us with participants for internet research, and they in turn pay the participants.

While they do have checks and balances in place to reduce fraud, it’s probably impossible to eradicate it completely. Many people live in countries where the standard of living is low, but the internet is available. If they sign up to “work” for one of these companies, they can make a reasonable amount of money this way, possibly even more than they can in jobs involving hard labour and long hours in unsanitary or dangerous conditions.

In itself, this is not a problem. However, there will always be a temptation to maximise the number of studies they can participate in, and one way to do this is to pretend to be relevant to, and eligible for, a larger number of studies. Gaming the system is likely to be happening, and some of us have seen indirect evidence of this (people with extraordinarily high numbers of concurrent illnesses, for example).

It’s not feasible (or ethical) to insist on asking for medical records, so we rely on trust that a person with heart disease in one study is also eligible to take part in a cancer study because they also have cancer, in addition to anxiety, depression, blood disorders or migraines and so on. Or all of these. Short of requiring medical records, there is no easy answer for how to exclude such people.

More insidiously, there will also be people who use other individuals to game the system, often against their will. We are only now starting to consider the possibility of this new form of slavery, the extent of which is largely unknown.

Enter the bots

Similarly, we are seeing the rise of bots that pretend to be participants, answering questions in increasingly sophisticated ways. Multiple identities can be fabricated by a single coder, who can then not only make a lot of money from studies but also seriously undermine the science we are trying to do (very concerning where studies are open to political influence).

It’s getting much more difficult to spot artificial intelligence. There was a time when written interview questions, for example, could not be completed by AI, but they now can.

It is only a matter of time before we find ourselves conducting and recording online interviews with a visual representation of a living, breathing individual who simply does not exist, for example through deepfake technology.

The Capture highlights the growing problem of deepfakes. (Wikipedia)

We are only a few years, if not months, away from such a profound deception. The British TV series The Capture might seem far-fetched to some, with its portrayal of real-time fake TV news, but anyone who has seen the current state of the art in AI can easily imagine us being just a short stretch away from its depictions of the “evils” of impersonation using perfect avatars scraped from real data. It is time to worry.

The only answer, for now, will be to simply conduct interviews face-to-face, in our offices or laboratories, with real people who we can look in the eye and shake the hand of. We will have travelled right back in time to the point a few decades ago mentioned earlier.

With this comes a loss of one of the great things about the internet: it is a wonderful platform for democratising participation in research for people who might otherwise not have a voice, such as those who cannot travel because of a physical disability, and so on. It is dismaying to think that every fraudster is essentially stealing the voice of a real person who we genuinely want in our studies. And indeed, previous research has found that between 20% and 100% of survey responses were fraudulent.

We must be suspicious going forward, even though our natural propensity, as amenable people who try to serve humanity with the work we do, is to be trusting and open. This is the real tragedy of the situation we find ourselves in, over and above the corruption of the data that feed into our studies.

It also has ethical implications that we urgently need to consider. We do not, however, seem to have any choice but to “hope for the best but assume the worst”. We must build systems around our research whose sole purpose is to detect and remove false participation of one type or another.

The sad fact is that we are potentially going backwards by decades to rule out a relatively small proportion of false responses. Every “firewall” we erect around our studies is going to reduce fraud (although probably not entirely eliminate it), but at the cost of reducing the breadth of participation that we desperately want to see.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. How the internet and its bots are sabotaging scientific research – https://theconversation.com/how-the-internet-and-its-bots-are-sabotaging-scientific-research-261796

Planning to take a degree taught in English when it’s not your first language? Here are some tips for success

Source: The Conversation – UK – By Una Cunningham, Professor emerita, Department of Teaching and Learning, Stockholm University

Every year, millions of students from all parts of the globe study for a degree through a language other than their first, usually English. In 2023, 25% of all higher education students in the UK were international students.

The understanding is that the incoming students will have, or develop, enough proficiency in English as a second language to study engineering, history, physics and other courses taught in English.

English-medium courses are also offered in countries where English is not the first language. In Sweden, where English has no official status, 66% of master’s programmes were taught through English in 2020. Universities in France primarily attract overseas students from Francophone Africa, to study in French, but they also offer courses taught in English.

Where domestic and international students study together, even those students who stay at home get to have an international experience when they meet students from other countries.

In some parts of the world, English is preferred over local languages for domestic students, in the belief that this will equip them for their professional lives.

Learning plans

If you’re planning on taking a degree taught in English and it’s not your first language, you already know that it will probably be more challenging than learning in your mother tongue would be.

Following what is being explained in lectures may be more difficult, and you may come up against unfamiliar vocabulary in the course literature. Group work can also be hard if students have varying levels of proficiency in English.

Lecturers may also be uncomfortable helping students with English and may not see themselves as language teachers, even though all students need to become familiar with the specific language used in the field they are studying.

Group work comes with challenges but can also allow useful collaboration. (ESB Professional/Shutterstock)

Fortunately, there is a lot you can do as a student to meet the challenges of studying in a second language.

  • Continue to work on your academic literacy. Vocabulary development is key to understanding academic texts and lectures.

  • Keep a list of key concepts and expressions related to the field you are studying as you come across them in your reading and lectures. Add translations into your strongest languages. Use a dictionary to get the exact meanings of words.

  • Do the assigned reading in good time. During your reading and lectures you can take well-structured notes in any or all of your languages. Use technology to support your reading, but be careful of mistakes made by automatic translation.

  • Research effective reading and note taking strategies. Use any study support your university offers. Practise writing in English regularly – free writing or copying out paragraphs from your set texts will develop your writing fluency.

  • Before lectures you may be able to access the lecturer’s slides. Make sure you understand them. Annotate them in your first language. Becoming familiar with course materials before a lecture or other activity can support learning by reducing the amount of new information you need to deal with in class.

  • If possible, arrange a study group with other students who share your first or another language. You each read the course literature and then discuss it together in the languages you choose, to make sure everyone is on board. If the lecturer has made summaries of the literature, or shares lecture slides, discuss them before or after lectures to make sure you have understood the main points.

  • Consider multilingual collaborative note taking with other students, so that you all can access and contribute to a shared document, possibly based on the lecture slides (but be aware that these notes cannot replace your independent classwork).

  • You may be reluctant to ask questions in class, but it is important that you are clear on what you are expected to do. Your question helps the lecturer see what is difficult, and others are probably wondering the same thing.

  • Plan and write a first draft of written work using any or all of your languages. This is called translanguaging – using all the language skills that you have at your disposal to think freely about your work. If you stick to what you can easily express in English you may limit your thinking.

You don’t need to do all your studying only in English. Use your linguistic resources to make the most of your opportunities.

The Conversation

Una Cunningham does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Planning to take a degree taught in English when it’s not your first language? Here are some tips for success – https://theconversation.com/planning-to-take-a-degree-taught-in-english-when-its-not-your-first-language-here-are-some-tips-for-success-254833

What to do when wasps crash your picnic – a scientist’s guide to dining safely with these insects

Source: The Conversation – UK – By Seirian Sumner, Professor of Behavioural Ecology, UCL

Wasps get a hankering for jam once the colony larvae pupate. victoras/Shutterstock

It’s summer in the northern hemisphere and that means sun, sea – and wasps.

A lot of us have been taught to fear wasps as aggressive insects that exist only to make our lives a misery. But with unsustainable wildlife loss across the planet, we need to learn to live alongside all organisms – even wasps. They are important pollinators and predators of insects.

A little knowledge about their natural history can help you dine safely alongside wasps.

The wasps that visit your picnic are typically the common yellowjacket (Vespula vulgaris) and the German wasp (Vespula germanica). They seem to appear from nowhere. What should you do?

1. Stay still, or she’ll think you’re a predator

Her (all workers are female) smell receptors have got her to your picnic table, but she’s now using visual landmarks (you and your surroundings) to orientate her way to the food on your plate. Keep your mouth closed and avoid breathing heavily to minimise the release of carbon dioxide, which wasps use as a cue that a predator is attacking. Similarly, if you start flapping and shouting, you are behaving like a predator (mainly badgers in the UK), which might trigger the wasp’s attack mode.

2. Watch what she is eating

This is a worker wasp. She is looking for food to feed to her sibling larvae in her mother’s papery looking nest. Is she carving off a lump of ham, gathering a dollop of jam or slurping at your sugary drink? Watch what she is eating because this gives you a clue to what your wasp offering will be. She is so focused on her task that she won’t notice you watching.

3. Make a wasp-offering to keep her from bothering you

Before you know it, she’s off with jaws full of jam or a hunk of ham. She might zigzag away from your table – a sign that she is reorientating for a reliable return. Once landmarks are mapped, she will fly straight and fast. If you followed her, she would lead you to her nest. But you are better off using your time to prepare your wasp offering, because she’s going to come back soon. Your offering should be a portion of whatever she harvested from your plate. You can move it slightly away from the rest of your food. If you let her have her share, you too can dine in peace.

You can gradually move your wasp offering further away from you. Wasp offerings are well-tested techniques around the world, whether you’re looking to track down a wasp nest to eat, or keep customers unbothered by wasps at an outdoor restaurant.

Are the wasps at your picnic making a beeline for sweet food? (hecke61/Shutterstock)

Happily, your picnic friend is unlikely to bring a swarm of wasps to your table, because social wasps are poor recruiters. This makes sense because wasp food (insects, carrion) is usually a scattered, short-lived resource. One caterpillar doesn’t necessarily mean there’s a huge patch of them, for example.

This contrasts with honeybees, for which there has been strong natural selection for the evolution of a communication system (waggle dance) to recruit many foragers to a patch of flowers.

However, you might get a few wasps at your picnic, especially if the nest is close, just by chance. Wasps tend to be attracted to a forage source by the presence of other wasps. If she sees a few wasps gathered, then she will investigate. But if there are too many wasps, this puts her off.

Wasps’ changing feeding habits

You may already know that wasps go crazy for sugar at the end of the summer. But why do they prefer protein earlier in the season? It depends on what is going on inside the colony – and this changes with the season.

Wasp larvae are carnivorous. Together, the workers rear thousands of larvae. If your wasp wants ham (or some other protein source) at your picnic, you know her colony is full of hungry larvae. You might notice this in early-to-mid summer – and no later than mid-to-late August.

Enjoy the knowledge that you are helping feed armies of tiny pest controllers, who will soon set to work regulating populations of flies, caterpillars, aphids and spiders.

A defining feature of an adult wasp is the tiny petiole (wasp-waist). This constriction between her thorax and abdomen evolved so her ancestors could bend their abdomens, yoga-style, to parasitise or paralyse their prey.

These wasps won’t be eating the ham themselves. (Franz H/Shutterstock)

The wasp-waist of an adult worker limits her to a largely liquid diet. She is like a waiter who must deliver feasts to customers without tasting it. The larvae tip her service with a nutritious liquid secretion, which she supplements with nectar from flowers. For much of the season, this is enough.

Blend science and a picnic

Towards the end of the summer, most wasp larvae have pupated – and a pupated larva doesn’t need feeding. So, demand for protein foraging diminishes, as do the sweet secretions that have kept the workers nourished.

This means worker wasps must now visit flowers for nectar – although your jam scone or sweet lemonade may also be exceedingly tempting. If your wasp is fixated on sugar at your table, then you know her colony is likely to be in its twilight phase of life.

Although time of the year is a good indicator of the balance of ham-to-jam in a wasp’s foraging preferences, weather, prey availability, local competition and rate of colony growth can influence them too. This means the switch from ham to jam this year may be different to next year.

We’d like you to help us gather data on this, to improve predictions on whether to offer your wasps ham or jam. To take part, report here whether the wasp at your picnic wanted protein (such as chicken, hummus, beef or sausage), jam (or anything sugary, including sugary drinks), or both.

The Conversation

Seirian Sumner receives funding from the UK government’s Natural Environment Research Council (NERC) and the Biotechnology and Biological Sciences Research Council (BBSRC). She is a Trustee and Fellow of the Royal Entomological Society, and author of the book ‘Endless Forms: Why We Should Love Wasps’

ref. What to do when wasps crash your picnic – a scientist’s guide to dining safely with these insects – https://theconversation.com/what-to-do-when-wasps-crash-your-picnic-a-scientists-guide-to-dining-safely-with-these-insects-261589

How ancient viruses could help fight antibiotic resistance

Source: The Conversation – UK – By Franklin Nobrega, Associate Professor, Microbiology, University of Southampton

Phages (red) attacking a bacterium (green). nobeastsofierce/Shutterstock.com

If bacteria had a list of things to fear, phages would be at the top. These viruses are built to find, infect and kill them – and they have been doing it for billions of years. Now that ancient battle is offering clues for how we might fight back against antibiotic-resistant infections.

As more bacteria evolve to withstand our antibiotics, previously treatable infections are becoming harder – and in some cases, impossible – to cure. This crisis, known as antimicrobial resistance (AMR), already causes over a million deaths a year globally, and the number is rising fast. The World Health Organization has named AMR one of the top ten global public health threats.

Phage therapy – the use of phages to treat bacterial infections – is gaining attention as a potential solution. Phages are highly specific, capable of targeting even drug-resistant strains. In some compassionate-use cases in the UK, they have cleared infections where every antibiotic had failed. But phages still face a challenge that is often overlooked: the bacteria themselves.

Bacteria have evolved sophisticated systems to detect and destroy phages. These defences are diverse: some cut up viral DNA, others block entry, and a few launch a kind of intracellular shutdown to prevent viral takeover. In a new study published in Cell, my colleagues and I describe a system that works differently, called Kiwa. It acts like a sensor embedded in the bacterial membrane, detecting early signs of attack.

Exactly what Kiwa is sensing remains an open question, but our findings suggest it responds to the mechanical stress that occurs when a phage latches on to the cell and injects its DNA. Once triggered, Kiwa acts fast. It shuts down the phage’s ability to make the components it needs to build new phages, stopping the infection before it can take over the cell.

But just as bacteria evolve ways to defend themselves, phages evolve ways to fight back. In our latest experiments, we saw two strategies in play.

A bacterium (orange) being attacked by phages (black dots).
Southampton University, CC BY

Some phages developed small mutations in the proteins they use to attach to the bacterial surface – subtle changes that helped them avoid triggering Kiwa’s detection system. Others took a different approach: they allowed themselves to be detected, but escaped the consequences.

These phages carried mutations in a viral protein that seems to be involved in how Kiwa shuts down the infection. We don’t yet know exactly how this works, but the result is clear: with just a few changes, the virus keeps replicating, even after Kiwa has been activated.

This evolutionary flexibility is part of what makes phages so powerful, and why they hold such promise in treating infections. But it also highlights a key challenge: to make phage therapy effective, we need to understand how these microbial battles play out.

Rules of engagement

If a bacterial strain carries a defence like Kiwa, not all phages will succeed against it. Some might be blocked entirely. But others, with just the right mutations, might slip through. That means choosing or engineering the right phage for the job is not just a matter of trial and error – it is a matter of knowing the rules of engagement.

Studying bacterial defence systems like Kiwa gives us a deeper understanding of those rules. It helps explain why some phages fail, why others succeed, and how we might design better phage therapies in the future. In time, we may be able to predict which bacterial defences a given strain carries, and select phages that are naturally equipped – or artificially tuned – to overcome them.

That is the idea behind our growing phage collection project. We are gathering phages from across the UK and beyond, including from public submissions – dirty water is often a goldmine – and testing them to see which ones can overcome the defences carried by dangerous bacteria. With over 600 types already catalogued, we are building a resource that could help guide future phage therapy, pairing the right phage with the right infection.

Kiwa is just one piece of the puzzle. Bacteria encode many such defence systems, each adding a layer of complexity – and opportunity – to this microbial arms race. Some detect viral DNA directly, others sense damage or stress, and some even coordinate responses with neighbouring cells. The more we learn, the more precisely we can intervene.

This is not a new war. Bacteria and phages have been locked in it for billions of years. But for the first time, we are starting to listen in. And if we learn how to navigate the strategies they have evolved, we might find new ways to treat the infections our antibiotics can no longer handle.

The Conversation

Franklin Nobrega receives funding from Royal Society and Wessex Medical Research.

ref. How ancient viruses could help fight antibiotic resistance – https://theconversation.com/how-ancient-viruses-could-help-fight-antibiotic-resistance-261970

No clear answers on antidepressants in pregnancy

Source: The Conversation – UK – By Urban Wiesing, Professor of Ethics and History of Medicine, University of Tübingen

The US Food and Drug Administration recently convened a panel of experts to examine a sensitive and increasingly urgent question: should antidepressants be prescribed to women suffering from depression during pregnancy?

To the surprise of many in the American medical community, the panel included not only US-based experts but also three international voices known for their critical views on psychiatric medication. Their inclusion sparked immediate controversy and foreshadowed the disagreements to come.

At the heart of the debate is a long-standing assumption in American medical practice: while antidepressants may carry some risk to the unborn child, the dangers of leaving maternal depression untreated are usually greater. Yet this mainstream position was strongly challenged. A majority of the panel appeared unconvinced that the benefits of antidepressant use in pregnancy clearly outweigh the potential risks.

As the discussion unfolded, fundamental questions remained unresolved. What exactly are the risks to the unborn child? The panel offered different answers.

How substantial are the benefits to a pregnant woman? Some experts questioned whether antidepressants deliver meaningful help in these circumstances at all. And without clarity on these points, how can the risk-benefit ratio be reliably assessed?

It’s a familiar scenario in science: experts looking at the same data but drawing different conclusions – not only about the facts, but also about how to interpret them. In this case, the division seemed to reflect deeper cultural and philosophical differences in how various countries approach mental health care during pregnancy.

The outcome of the panel’s deliberations reflected that divide, with no consensus reached.

To some extent, the conflict was embedded in the very design of the panel. When those with sharply opposing views are brought together without agreement on the evidence base, gridlock is a likely result. Still, the impasse underlines the need for more independent, high-quality research on the effects of antidepressants during pregnancy – research that can inform not only regulators but also doctors and patients.

Complicating matters further is the political climate. The current US health secretary – Robert F. Kennedy Jr. – has, critics argue, an uneasy relationship with scientific consensus, which makes trust in the process all the more fragile.

FDA expert panel discussion on antidepressants and pregnancy.

A warning label is not a substitute for a conversation

Still, the panel produced one tangible suggestion: a proposal from around half of its members to place a so-called “black box” warning on antidepressant packaging, alerting pregnant women to potential risks to the unborn child. Such warnings are typically reserved for the most serious medical concerns. But is this really the right approach?

A comparison often made is to cigarette packaging. But this analogy quickly breaks down. Cigarettes are freely bought; antidepressants are prescribed following a medical consultation. To issue a blunt warning on a medicine that has already been deemed appropriate by a doctor risks undermining the doctor–patient relationship.

If stronger warnings are needed, the real problem may lie in the consultation process itself, not in the packaging.

Pregnancy presents a unique ethical dilemma. The unborn child cannot give consent, and damage sustained in the womb can result in lifelong consequences. At the same time, untreated depression in a pregnant woman carries serious risks of its own – for both mother and child. This is a classic medical conflict, with no easy solution.

And while US law gives pregnant women the right to make such decisions – albeit with variation across states – it doesn’t solve the underlying uncertainty. That must be navigated through informed, respectful dialogue between doctor and patient, not by resorting to fear-inducing labels.

Ultimately, every case is personal. Every decision must take into account the individual’s mental health, support system, risk tolerance and values. What’s needed is thoughtful communication, prudent prescribing and careful balancing of benefit and harm. In short: good medicine.

What’s not needed is to heap more guilt on women already grappling with depression. If scientists and policymakers cannot agree, pregnant women should not bear the burden of that confusion. They deserve support, not stigma.

The Conversation

Urban Wiesing does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. No clear answers on antidepressants in pregnancy – https://theconversation.com/no-clear-answers-on-antidepressants-in-pregnancy-261724