To understand and manage forest fires better, we must look at the landscape

Source: The Conversation – (in Spanish) – By Irene Repeto Deudero, PhD candidate in Biology, Universidad de Cádiz

Maritime pine (_Pinus pinaster_) plantation in the Sierra de la Culebra (Zamora) one year after the 2022 fire. Juli G. Pausas

In recent years, forest fires have become larger, more frequent and more severe in many parts of the world. Summers are hotter, droughts are longer, and fires that were exceptional a few decades ago are now commonplace.

According to the European Forest Fire Information System, more than 540,000 hectares burned in the Iberian Peninsula in August 2025, the highest figure since these records began.

Fuel-rich landscapes

When asked whether climate change is behind them, the answer is far from simple. Climate change undoubtedly lengthens the fire-risk season and creates conditions more conducive to fires, but it is not the only cause. Forest fires also depend on ignition sources, on the terrain and on the vegetation that feeds them.

To understand today's fires, we also have to look at the landscape we have built.

From the 19th century onwards, large-scale planting programmes spread across Europe, designed to produce timber, protect soils or drive economic development. In Spain, more than 5 million hectares were planted, mostly with pine species, a legacy that has profoundly shaped today's landscape.

Over time, rural abandonment has increased the amount and continuity of biomass, creating fuel-rich landscapes that are now highly vulnerable to fire. But how do different fuel types influence a fire's severity and the land's recovery? Can management reduce the impact of fire on plantation-dominated landscapes?




Read more:
Forest fires: it's not all climate change


Lessons learned from the fires in Spain

Our team studied three large fires in Spain (Sierra Bermeja, La Culebra and Las Hurdes, which occurred between 2021 and 2023) and found that pine plantations burned more severely than oak woodlands, mixed forests or shrublands. Moreover, those last three vegetation types showed clear signs of recovery after one year, while the plantations remained practically barren.

One year after the fire, the landscape of the planted areas resembled neither the original ecosystem nor a functional plantation. We also observed that plantations can put their neighbours at risk: in all three fires, fire severity in the adjoining vegetation was higher the closer it stood to those plantations.

Not all plantations burned equally, however. We detected a threshold at around 440 pines per hectare: above that density, fire severity soars. By contrast, in plantations where tree density and the understorey had been managed, the fire's consequences were far less severe.
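That threshold invites a simple illustration. Below is a minimal Python sketch, not the researchers' actual model: it encodes the reported ~440 trees/ha figure as a rough severity classifier, with the category labels and the size of the management effect as assumptions for illustration only.

```python
# Minimal sketch, not the study's model: encodes the reported ~440 trees/ha
# threshold as a rough burn-severity classifier. Category labels and the
# management effect are illustrative assumptions.

DENSITY_THRESHOLD = 440  # pines per hectare, as reported in the study

def expected_severity(trees_per_ha: float, managed: bool) -> str:
    """Rough qualitative burn-severity class for a pine plantation."""
    if trees_per_ha > DENSITY_THRESHOLD:
        # Dense stands burned most severely; managing density and the
        # understorey markedly reduced the fire's impact.
        return "moderate" if managed else "high"
    return "lower"

print(expected_severity(600, managed=False))  # -> high
print(expected_severity(600, managed=True))   # -> moderate
print(expected_severity(300, managed=False))  # -> lower
```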

These observations are good news, because they mean that in these plantations management can make a difference. Practices such as thinning, pruning and selective logging lower the fuel load and its continuity, and they proved effective at reducing the fire's impact in those areas. Moving from an abandoned plantation to a managed one could mark the difference between a fire that can be put out and one that is extremely difficult to handle.




Read more:
Forest management strategies for adapting the landscape to a warmer, more fire-prone world


Fostering a more diverse, less flammable landscape

In the past, pine plantations played a key role in the European economy and shaped much of the landscape we see today. Some reforestations helped curb soil loss after decades of overexploitation of the hills, and forestry wages allowed many families to live with dignity in a post-war economy.

However, as climatic conditions grow ever harsher, those plantations risk becoming critical hotspots for large fires.

Landscape in Sierra Bermeja (Málaga) one year after the 2021 fire.
Juli G. Pausas

The results of our research show that not all vegetation burns in the same way: the type, density and connectivity of the fuel determine the fire's consequences for the ecosystem and its capacity to recover. This finding matters because it implies that serious investment in forest management is no longer optional.

We must move beyond an approach centred on extinguishing fires once they have broken out and instead commit to reducing the territory's vulnerability before a fire starts.

Fostering a more diverse, less flammable landscape, alongside keeping plantations under active management, is a realistic and effective strategy. The measures it involves not only bring short-term benefits but also strengthen regional fire prevention and adaptation plans. Moreover, they complement, rather than replace, global strategies to mitigate climate change.

Although implementing them entails significant costs and logistical challenges, it also offers opportunities to promote a more sustainable forestry industry, to mitigate the consequences of rural abandonment and to foster landscapes where plantations form part of a balance between economic productivity and ecosystem conservation.

In an ever warmer, more fire-prone world, planting trees as a climate mitigation strategy carries significant risks. Yet unlike climate or topography, fuel is something we can control.

Rethinking which species we plant, where and how, and above all what happens to them afterwards, is an essential condition for building more fire-resilient landscapes and, ultimately, for learning to live with fire.

The Conversation

Irene Repeto Deudero received funding from the Universidad de Cádiz.

ref. Para comprender y gestionar mejor los incendios forestales debemos observar el paisaje – https://theconversation.com/para-comprender-y-gestionar-mejor-los-incendios-forestales-debemos-observar-el-paisaje-268444

Bacteria-based sensors: the new invisible guardians of water

Source: The Conversation – (in Spanish) – By Anna Salvian, Postdoctoral researcher in the BioE Group, IMDEA AGUA

"Water is the driving force of all nature," wrote Leonardo da Vinci. Turning on the tap and getting clean water seems simple, but behind it lies a complex system that runs from abstraction and distribution to treatment and, increasingly, reuse.

Today that balance is getting harder to maintain: climate change, resource scarcity, pollution and the high energy consumption of treatment make it ever more difficult to guarantee a safe, sustainable supply. Facing these threats requires moving towards a circular water economy, with strategic, data-driven decisions geared towards efficiency and resilience.

The bioelectrochemical biosensor revolution

Until now, monitoring water quality at each stage of its cycle has meant collecting samples and analysing them in the laboratory. The method is reliable, but slow and costly, and it does not always reflect what is happening in real time. That is why technologies that can monitor water quality instantly and reliably are essential for optimising its management across the whole cycle, from abstraction through use, treatment and reuse.

In this context, bioelectrochemical biosensors stand out for their versatility and their adaptability to the different phases of the water cycle. These devices employ microorganisms capable of "feeding" on the contaminants present in the water, using those substances as an energy source.

During this metabolic process, the bacteria release electrons – negatively charged subatomic particles – which are captured by the sensor and converted into a measurable electrical signal. The level of current generated thus directly reflects the water's biological activity and degree of contamination in real time.
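As a rough illustration of how such a reading might be interpreted, here is a minimal Python sketch. The linear calibration, its coefficients and the baseline value are hypothetical; real sensors are calibrated against laboratory measurements.

```python
# Minimal sketch of interpreting a bioelectrochemical biosensor reading:
# microbial metabolism releases electrons, the sensor measures a current,
# and a pre-established calibration maps that current to an estimate of
# organic load. The linear calibration below is invented for illustration.

def estimate_organic_load(current_uA: float,
                          slope_mg_per_uA: float = 2.5,
                          baseline_uA: float = 0.8) -> float:
    """Estimate organic contamination (mg/L) from a measured current (µA)."""
    signal = max(current_uA - baseline_uA, 0.0)  # subtract abiotic background
    return signal * slope_mg_per_uA

# Example: a 12 µA reading would suggest roughly 28 mg/L of organic matter
print(f"{estimate_organic_load(12.0):.1f} mg/L")
```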

Placed at different points in the water cycle, these devices make it possible to:

  • Detect contamination at source, before it reaches the consumer.

  • Optimise treatment in wastewater plants.

  • Guarantee the safety of reuse.

Before treatment: a preventive function

The cycle begins in springs, rivers and aquifers, sources increasingly exposed to chemical contaminants, illegal discharges and infiltration of wastewater.

Here, biosensors installed in groundwater or surface water allow contaminants to be detected continuously. Their function is preventive: they help avoid poisonings and ensure that water reaches drinking-water treatment plants in safe condition.

For example, bioelectrochemical biosensors have been shown to detect petroleum-derived hydrocarbons, a key advance given that these compounds are among the most common contaminants of groundwater.

During: more efficient treatment

After use, water reaches wastewater treatment plants, where contaminants are removed before it is returned to the natural environment. In this phase, bioelectrochemical biosensors play a double role.

On the one hand, they make it possible to monitor the total load of organic contaminants at the inlet: the higher the organic load, the more electricity the sensor's bacteria generate, and that current can be measured to estimate the amount of matter to be treated.

In a study carried out by scientists from Spain and the United Kingdom, we showed that the bacterial communities growing on the anode (one of the electrodes) of these sensors are highly resilient, which allows them to work even in dirty water or adverse environments without losing effectiveness.

This information is especially useful because treatment plants, though essential facilities, have a high energy consumption: much of the electricity goes to aerating the biological reactors so that microorganisms can degrade the organic contaminants.

By measuring in real time the microorganisms' oxygen demand for degradation, or the contaminant load, operators can adjust aeration dynamically. In this way they cut electricity consumption, reduce greenhouse gas emissions and maintain the quality of the treated water. A double advantage: economic and environmental.
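The idea of load-driven aeration can be sketched in a few lines. The proportional rule and all the numbers below are assumptions made for illustration; real plants use far more sophisticated control.

```python
# Minimal sketch of dynamic aeration control: the blower setpoint tracks
# the organic load reported by the biosensor instead of running at a fixed
# rate. A simple proportional rule stands in for a real plant controller;
# all numbers are hypothetical.

def aeration_setpoint(load_mg_per_L: float,
                      k: float = 0.04,       # aeration per unit load (assumed)
                      minimum: float = 0.5,  # keep the biology alive at low load
                      maximum: float = 4.0) -> float:
    """Return a blower setpoint (arbitrary units) proportional to the load."""
    return min(max(k * load_mg_per_L, minimum), maximum)

for load in (10, 50, 120):  # quiet night, typical day, storm inflow
    print(load, "->", aeration_setpoint(load))
```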

On the other hand, these sensors can also detect toxic substances that disrupt the activity of the bacteria responsible for cleaning the water. Since biological treatment depends on the health of these microorganisms, it is crucial to ensure they are not exposed to compounds that damage them.

In this context, bioelectrochemical biosensors have been developed that can identify changes in microbial metabolic activity caused by flocculants – substances used in industrial processes or in treatment plants to clump particles together – or by their toxic residues, heavy metals and biocides such as pesticides. This system provides an early toxicity signal, making it possible to act immediately and protect the biological balance of the treatment process.

After: safe water for reuse

Increasingly, the water cycle is closed through reuse. In a context of droughts, reclaimed water is used for agricultural irrigation, urban cleaning and even industrial processes.




Read more:
The challenge of reclaiming wastewater for use in agriculture


But reusing water requires guaranteeing its quality. Biosensors allow treated water to be monitored in real time, ensuring it meets safety standards before being put to a new use. This strengthens confidence in reuse and advances a circular-economy model.

Moreover, this type of sensor is not limited to water. It can also be applied to studying soil, especially land irrigated with treated water. It is possible to monitor how microbial activity and soil conditions evolve, to ensure that water reuse does not disturb the soil's biological balance.

That information is very valuable because the soil's microbial life is directly linked to its fertility: a soil with a balanced, active microbiome favours better nutrient availability and, in turn, higher crop productivity.




Read more:
A guide to the microorganisms that give life to soils and secure farming's future


From monitoring to a predictive future

The great advantage of biosensors is that they make it possible to move from a reactive system to a predictive one. It is no longer just about checking water quality once a problem has already occurred, but about anticipating it, managing resources better and responding in real time.

These predictive models are made possible by collecting large volumes of data (big data) and by using machine learning and deep learning tools. Together they make it possible to analyse patterns and predict future behaviour with great accuracy.
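As a toy illustration of the approach, the sketch below fits a model mapping sensor current to the contaminant load that will need treating. The synthetic data and the simple linear model stand in for the big-data pipelines described in the text; this does not reflect any specific published model.

```python
# Minimal sketch of the predictive idea: learn a mapping from biosensor
# current to contaminant load. Synthetic data stands in for real plant
# measurements; a linear model stands in for deep-learning pipelines.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
current_uA = rng.uniform(1, 20, size=200)              # sensor readings
load_mg_L = 2.5 * current_uA + rng.normal(0, 1, 200)   # hidden "true" relation

model = LinearRegression().fit(current_uA.reshape(-1, 1), load_mg_L)
print("Predicted load at 15 µA:", model.predict([[15.0]])[0])
```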

A research group in the United States has already applied these techniques to bioelectrochemical biosensors, identifying the variables that influence the sensor's electrical current and predicting carbon and nitrogen removal during the treatment process.

Integrated into the digitalisation of the water cycle, these advances open the door to a more transparent, efficient and environmentally respectful management model, one that protects public health, improves energy efficiency and reduces the carbon footprint of water infrastructure.

A safer, more sustainable future

In a world where water demand could exceed available resources by 40% by 2030, betting on technological innovation is essential. Biosensors are emerging as key allies in guaranteeing water that is clean, safe and managed sustainably.

Water, as da Vinci said, is the force that drives nature. Today, thanks to science, we have new invisible guardians to protect it: biosensors that watch over it drop by drop, in real time.

The Conversation

Anna Salvian receives funding from the European Union's Horizon Europe Research and Innovation Programme (project no. 101058174 "TrineFlex").

ref. Sensores basados en bacterias, los nuevos guardianes invisibles del agua – https://theconversation.com/sensores-basados-en-bacterias-los-nuevos-guardianes-invisibles-del-agua-268042

How can we know the history of the Earth?

Source: The Conversation – (in Spanish) – By Laura Damas Mollá, Researcher in Geology, Universidad del País Vasco / Euskal Herriko Unibertsitatea

Formations with clearly visible strata in Zumaia (Gipuzkoa, Spain). Guillermo Guerao Serra/Shutterstock

This article is part of The Conversation Júnior, a section in which specialists from leading universities and research centres answer questions from curious young people aged 12 to 16. You can send your questions to tcesjunior@theconversation.com


Question submitted by the 2nd-year ESO class at the Instituto de Educación Secundaria Miguel de Unamuno, in Gasteiz (Álava)


Our planet is much, much older than any human civilisation. Its 4.5 billion years of history have left countless traces in the rocks, and geology is the science charged with deciphering them so that we can read the different chapters of the "book of the Earth", with plots that intertwine.

The sediments speak

First of all, recall that rocks have several origins: igneous rocks form from magma or lava, and metamorphic rocks from the transformation of other rocks. But the ones that interest us most here, for now, are those of the third group: sedimentary rocks.

Sedimentary rocks are born from the accumulation of particles, more or less fine, generated by the breakdown and erosion of rocks exposed in elevated terrain such as mountains. These particles are transported by various agents (rivers, wind, glaciers…) towards zones of accumulation. There, the sediments settle, layer by layer, in roughly horizontal levels: the strata.

Rock strata in Muskiz (Bizkaia, Spain).
Laura Damas Mollá

According to the so-called law of horizontality of strata, established by the Danish scientist Nicolaus Steno in the 17th century, the lowest stratum is the oldest. The trouble is that we do not always find strata laid out that way.

Eocene cliffs (56 to 33.9 million years old) formed by alternating layers of two rock types: sandstones and mudstones. Here, the strata are arranged vertically. San Sebastián, Gipuzkoa.
Laura Damas Mollá

When studying sedimentary rocks, the first thing we must deduce is where their materials were deposited; that is, whether they formed in seas, rivers, lakes…. The fossil remains of living things can provide the answer. Thus, a limestone containing fossils of marine organisms such as molluscs or corals arose in a tropical, shallow-water environment, because the fauna is similar to today's. If those fossils are found whole, we can even reconstruct the reefs.

If, on the other hand, we find the fossils broken and jumbled, we can assume that some kind of current disturbed them and carried them away from their habitat. On other occasions, as with sandstones, the rocks show fine lines, or lineations, that mark the direction and even the sense of the currents that transported the sediments, just as happens on a beach today.

These pieces of the puzzle of Earth's history are interpreted according to the principle of "actualism" (uniformitarianism). Formulated by the British geologist Charles Lyell in 1830, it states that "the present is the key to the past".

Longitudinal section of a bivalve mollusc (rudist) from the Cretaceous, between 143 million and 66 million years ago. Andrabide quarry (Gautegiz Arteaga, Bizkaia).
Laura Damas Mollá

Details that give away the age

With these clues it is possible to interpret the environment where sedimentary rocks were deposited, but we still do not know their age. Several techniques exist to find that out.

First, the minerals that make up these rocks contain radioactive isotopes, chemical components whose analysis lets us know how long they have been on Earth. This method relies on the decay of a "parent" isotope that progressively transforms over time into its "daughter". Knowing the proportion between the two in a sample, we can obtain its age.

In the case of the celebrated carbon-14 technique, samples of organic origin are needed, so it cannot be applied to many rocks and minerals. Moreover, that isotope's usable "life" is little more than 60,000 years. For rocks, minerals and fossils we use other radioactive isotope pairs, such as uranium/thorium or uranium/lead, which allow dating from 500,000 years to several billion years, better suited to the Earth's long history.
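For the general parent/daughter method, the age follows from the standard decay formula t = (t_half / ln 2) · ln(1 + D/P), where D/P is the measured daughter-to-parent ratio and t_half the parent's half-life. A short worked example in Python, with an invented sample ratio and assuming no daughter isotope was present initially:

```python
# Worked example of parent/daughter dating, following the text: the age
# follows from the measured daughter/parent ratio and the parent's
# half-life via t = (t_half / ln 2) * ln(1 + D/P). The sample ratios are
# invented for illustration.
import math

def radiometric_age(daughter_to_parent: float, half_life_years: float) -> float:
    """Age in years from the D/P ratio, assuming no initial daughter."""
    return (half_life_years / math.log(2)) * math.log(1 + daughter_to_parent)

# Carbon-14 (half-life ~5,730 years): a sample with D/P = 1 has lived one
# half-life; with D/P = 3 it has lived two (only 1/4 of the parent remains).
print(radiometric_age(1.0, 5730))  # ≈ 5,730 years
print(radiometric_age(3.0, 5730))  # ≈ 11,460 years
```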

And a curiosity: did you know there is also a technique for determining when a grain of quartz last saw sunlight? It is called optically stimulated luminescence, and it is used to estimate the age of samples between 1,000 and 500,000 years old.

But these tools do not work for every type of rock, so we also use other dating methods. The best known consists of examining how fossil content varies over time; in other words, evolution. Life on Earth transforms through time, and finding particular associations of fossils lets us establish an age range for the strata. Although the most famous fossils are the large ones, like dinosaurs, we normally use microfossils, studied under magnifying lenses.

Rocks subjected to geological "tortures"

But the history of the rocks is incomplete if we only work out the environment where they were forged and their age. Different geological processes mean that rocks born on the sea floor, for example, now form part of today's mountains. From the moment the materials are deposited until the present, sedimentary rocks undergo a process called diagenesis: they are heated, compacted by burial and experience various changes in their components (some dissolve, others transform, others fracture…).

For most of its existence, a sedimentary rock is subjected to these "tortures", which we can understand and order chronologically. To do so we use special microscopes, called petrographic microscopes, and rock slices 0.3 mm thick.

And as if that were not enough, strata are not always found in a horizontal position, like the layers of a cake. Just as when we push a tablecloth with our hand, the forces of the tectonic plates fold the rock strata. Geologists also have to "read" the chapters starring igneous rocks, which tell the story of past volcanic eruptions, and metamorphic rocks, which speak of the transformation of some rocks into others.

Thus, little by little, we reconstruct the planet's biography, from environmental changes to the evolution of life. Understanding that history allows us to make sense of the changes happening today and to reflect on our brief chapter as hominids, since the Earth will keep transforming long after our presence on it.

If you look around and put on your geologist's glasses, you will discover what stories the rocks preserve, and with them clues to where we are heading.


The Chair of Scientific Culture at the Universidad del País Vasco collaborates on The Conversation Júnior section.


The Conversation

Laura Damas Mollá does not receive a salary, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond the academic appointment cited.

ref. ¿Cómo podemos conocer la historia de la Tierra? – https://theconversation.com/como-podemos-conocer-la-historia-de-la-tierra-266404

Guidelines for following a true Mediterranean diet

Source: The Conversation – (in Spanish) – By Ana Belén Ropero Lara, Associate Professor of Nutrition and Food Science – Director of the BADALI project, a nutrition website. Institute of Bioengineering, Universidad Miguel Hernández

Fish, fruit and vegetables are an essential part of the Mediterranean diet. monticello/Shutterstock

Inscribed on UNESCO's List of the Intangible Cultural Heritage of Humanity since 2013, the Mediterranean diet is an essential part of our culture that also benefits health. But its protective power does not lie in today's eating habits or in products manufactured in the Mediterranean region. It comes, rather, from the classic dietary pattern whose effects have been validated by dozens of scientific studies involving thousands of people.

So what can we call a Mediterranean diet?

According to UNESCO's definition, it "comprises a set of knowledge, practical skills, rituals, traditions and symbols related to agricultural crops and harvests, fishing and animal husbandry, and also to the way food is conserved, processed, cooked, shared and consumed". It is therefore also a social pattern.

Even so, the fundamental component of this cultural heritage is, as its name suggests, the diet itself. To find out how Mediterranean our eating is, we can check 14 points: the higher the score, the closer the resemblance (a rough scoring sketch follows the lists below).

First, the recommended foods that add points are:

  • Olive oil.

  • Vegetables.

  • Fruit.

  • Legumes.

  • Fish and seafood.

  • Nuts.

  • White meat (chicken, turkey, rabbit).

  • The traditional sofrito to accompany main dishes.

And the products to avoid, the ones that pull us away from the Mediterranean diet, are:

  • Red and processed meats.

  • Butter, margarine and cream.

  • Carbonated drinks, whether sugary or not.

  • Commercial pastries.
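As promised above, here is a minimal sketch of how such a score could be computed. The item names are paraphrased from the lists, and the rule is simplified (these 12 items of the 14; the remaining two PREDIMED points are frequency questions), so this is an illustration, not the validated questionnaire.

```python
# Minimal sketch of a Mediterranean-diet score in the spirit of the
# PREDIMED questionnaire: each recommended item eaten earns a point, as
# does each discouraged item avoided. Item names are paraphrased; this is
# an illustration, not the validated instrument.

RECOMMENDED = ["olive_oil", "vegetables", "fruit", "legumes", "fish_seafood",
               "nuts", "white_meat", "sofrito"]
TO_AVOID = ["red_processed_meat", "butter_margarine_cream",
            "carbonated_drinks", "commercial_pastries"]

def med_diet_score(habits: dict) -> int:
    """habits maps each item to True if consumed regularly."""
    points = sum(1 for item in RECOMMENDED if habits.get(item, False))
    points += sum(1 for item in TO_AVOID if not habits.get(item, False))
    return points  # 0..12 here; PREDIMED adds two frequency items

example = {"olive_oil": True, "vegetables": True, "fruit": True,
           "red_processed_meat": True, "carbonated_drinks": False}
print(med_diet_score(example))  # -> 3 recommended + 3 avoided = 6
```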

Proven health benefits

The first study on the Mediterranean diet was published in 1970, but more than 20 years would pass before it attracted interest again. Since then it has been the focus of intense nutrition research.

One of the main studies is Spanish and involved more than 90 researchers: the PREDIMED study, designed to evaluate the effects of this dietary pattern in older people at high risk of cardiovascular disease. The conclusions were unequivocal: the incidence of cardiovascular events was clearly lower in the Mediterranean-diet groups.

Now, after decades of research, the results leave no room for doubt: the Mediterranean diet is a firm ally of our health. Following it reduces the risk of death and of suffering three of the most common diseases of our time: cardiovascular disease, cancer and type 2 diabetes. It also protects against cognitive decline, dementia and Alzheimer's disease.

Not-so-Mediterranean menus

Despite all this evidence, the reality is that our current diet bears little or no resemblance to the Mediterranean one. We also tend to believe, wrongly, that widely consumed foods such as pork, cured ham and cheese are part of it.

If we look at the 14 points used in the PREDIMED study, one of them reads: "Do you preferentially eat chicken, turkey or rabbit instead of beef, pork, hamburgers or sausages?" To earn the point, the answer must be yes. In addition, one of the supplementary recommendations is to eat no more than one serving of cured ham or red meat per week.

The main reason for this restriction is that cured ham, like any other cured meat product, is a processed meat, and processed meats raise the likelihood of colorectal cancer, the second-deadliest cancer in the world. Another reason is their high salt content, which increases the risk of cardiovascular and kidney disease.

Furthermore, in health terms pork is classed as red meat, an additional risk factor for the aforementioned colorectal cancer. On top of that, eating a lot of red and processed meat also increases the likelihood of dying from cardiovascular problems.

As for cheeses, low-fat varieties have no strict limits in the Mediterranean diet, whereas cured or fatty cheeses (the vast majority) are restricted to at most one serving a week. This is because of their high salt content and the predominance of saturated fats, two considerable health risks.

Wine: yes or no?

One of the most controversial aspects of the Mediterranean diet is wine. Although it scores positively in the PREDIMED study, its consumption is not encouraged; rather, the amount is capped for those who are already habitual drinkers.

Wine, particularly red wine, enjoys a good reputation thanks to the polyphenols that come from the grape. Indeed, the PREDIMED study itself found that a higher polyphenol intake may reduce the risk of death. However, these natural compounds are also found in widely eaten foods such as virgin olive oil, nuts, fruit and vegetables.

Moreover, wine is still an alcoholic drink, and the World Health Organization's conclusion on the matter is that drinking alcohol can cause more than 200 health problems. Indeed, the WHO considers that there is no level of consumption, however low, that is free of risk.




Read more:
"I don't drink much, I drink a normal amount"


The good news is that we are always in time to improve our health. It does not matter at what age we do it: any moment is a good one to bring our diet closer to the 14 points of the Mediterranean diet. Every step we take towards greater adherence to this dietary pattern, however small, brings a benefit in disease prevention.

The Conversation

The authors do not work for, consult for, own shares in or receive funding from any company or organisation that might benefit from this article, and have declared no relevant affiliations beyond the academic position cited above.

ref. Pautas para seguir una verdadera dieta mediterránea – https://theconversation.com/pautas-para-seguir-una-verdadera-dieta-mediterranea-266678

First subpoenas issued as Donald Trump’s ‘grand conspiracy’ theory begins to take shape

Source: The Conversation – UK – By Robert Dover, Professor of Intelligence and National Security & Dean of Faculty, University of Hull

In recent weeks, Donald Trump’s supporters have begun to align around the idea that a Democrat-led “grand conspiracy” – potentially involving former president Barack Obama – has been plotting against the US president since 2016. The narrative is that the 2016 Russia investigation, which resulted in the Mueller inquiry, was part of this deep-state opposition to Trump, as was the investigation into the January 6 riot at the US Capitol.

The focus of the fightback by Trump’s supporters is in Miami, where a Trump-appointed US attorney, Jason A. Reding Quiñones, has begun to issue subpoenas to a wide range of former officials.

This has included former CIA director John Brennan, former FBI counterintelligence official Peter Strzok, former FBI attorney Lisa Page and former director of national intelligence James Clapper, all of whom were involved in the federal investigation into alleged links between Russian intelligence and Trump’s 2016 presidential campaign.

The way the so-called conspiracy is unfolding will feel familiar to anyone who has watched US politics closely in the past decade: a constant stream of allegations and counter-allegations. The narrative from the Trump camp is that powerful “deep state” forces have been arrayed against the president, and that the “two-tier” justice system that has persecuted Trump can only be rebalanced by pursuing those who investigated him in 2017 and 2021.

The grand conspiracy narrative shares similarities with other prominent conspiracy theories and how they spread. The QAnon movement, whose most famous claim is of a global paedophile ring involving senior Democrats and run out of a Washington pizza parlour, is one in which disparate claims are sporadically and partially evidenced. The political potency of such claims lies not in the individual pieces of evidence but in the overarching story.

The story is that hidden government and proxy networks manipulate the truth and judicial outcomes and that only through pressure from “truthers” (what many people in the US who believe conspiracy theories call themselves) will wrongdoers be brought to account. Once these ideas are popularised, they take on a momentum and a direction that is difficult to control.

Campaign of ‘lawfare’

Soon after his inauguration, Trump set up a “weaponization working group” within the Department of Justice. Its director, Ed Martin, said in May that he would expose and discredit people he believes to be guilty, even if the evidence wasn’t sufficient to charge them: “If they can be charged, we’ll charge them. But if they can’t be charged, we will name them. And we will name them, and in a culture that respects shame, they should be people that are ashamed.”

In the US the norm has been to “charge crimes, not people”, so this modification fundamentally changes the focus of prosecutors.

Former FBI director James Comey responds to his indictment by a grand jury in September.

The recent subpoenas in Florida show this principle at work, effectively turning legal process into the punishment. Even without full court hearings on specific charges, being forced to provide testimony or documents creates suspicion around those who are targeted. Criticism from legal officials that this is an “indict first, investigate second” method suggests that this is a break from historical norms.

Lawfare, defined as “legal action undertaken as part of a hostile campaign”, doesn’t require a successful prosecution. It merely requires enough investigative activity to solidify a narrative of suspected guilt and enough costs and pressure to seriously inconvenience those affected by it. In the new era of digital media, it’s enough to degrade the standing of a political opponent.

In that way, political retaliation has become a prosecutorial objective. This is clear from the US president’s frequent social media posts about his enemies, such as former FBI director James Comey, who investigated his alleged links to Russia, or Adam Schiff, the senator who led his impeachment in 2019.

Hardball politics or authoritarianism?

Political scientists argue that authoritarianism is something that happens little by little. Some of these steps involve using state power to target political opponents, degrading checks and balances and making loyalty a legal requirement.

There are reasons to believe the US is currently tracking this trajectory, certainly when it comes to using the Justice Department to harass the president’s political enemies and pushing back against court judgments while attacking the judges who issued them.

Further slides towards authoritarianism are possible because of the political potency of contemporary conspiracy movements. The right-wing QAnon movement, for example, has been exceptionally agile. It has offered its followers identity, community spaces and a logic that encourages active participation, exhorting believers to “do your own research”, for example.

With material from the investigations into allegations that the late financier Jeffrey Epstein ran a sex trafficking ring involving some influential US citizens being added almost daily, many Americans have concluded, as a general truth, that their elites do hide things. This makes it far simpler for broader conspiracies to gain traction and more difficult for politicians and journalists to work out what is conspiracy and what is evidence. The result is a problematic feedback loop: hints of wrongdoing fuel public suspicion, and public suspicion fuels the idea that further investigation is needed.

But to suggest that anyone has control over this would be wrong. These movements can just as easily consume those seen as supporters as those seen as enemies. Marjorie Taylor Greene’s determination to release the full and unredacted Epstein files could well produce negative outcomes for some Maga supporters, including prominent ones.

So, the transformation of legal process into public spectacle in America is suggestive of a drift towards authoritarianism. America’s famous “constitutional guardrails” of separation of powers, independent courts, juries and counsels will be pivotal in preventing this. They will need to stand firm.

The grand conspiracy theory might be more about seeking to isolate, and financially and emotionally exhaust opponents, while at the same time destroying America’s system of checks and balances. It might work.

The Conversation

Robert Dover does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. First subpoenas issued as Donald Trump’s ‘grand conspiracy’ theory begins to take shape – https://theconversation.com/first-subpoenas-issued-as-donald-trumps-grand-conspiracy-theory-begins-to-take-shape-269542

Flu season has started early in the UK – here’s what might be going on

Source: The Conversation – UK – By Conor Meehan, Associate Professor of Microbial Bioinformatics, Nottingham Trent University

The UK’s flu season is already well underway. simona pilolla 2/ Shutterstock

Flu season has got off to an early start this year in the UK – with cases spiking weeks earlier than in previous years. This has led to concerns that the UK may be on track for one of its worst flu seasons ever.

In the UK and other northern hemisphere countries, flu season tends to run from mid-November to mid-February. In the southern hemisphere, it runs from May to July.

It’s hard to know the exact number of flu cases the UK is currently seeing as most people don’t report when they have the flu. Most just stay in bed and recover. To get a picture of this year’s flu season, we rely on hospital data and GP reports. This usually only represents the most severe flu cases.

We know flu season is “starting” when about 10% of suspected cases come back positive for the influenza virus.

The UK’s flu season is already well underway, weeks before it usually starts: at the start of November, 11% of daily tests were already coming back positive for the flu. At the same time last year, just 3% of tests were positive. The UK crossed the 10% threshold a whole month earlier than it did last year.

School-aged children are currently most affected, with 38% of tests coming back positive for the flu – up from 30% just one week prior. Around this time last year, the number of children testing positive for flu was just under 7%.

Cases have spiked a month earlier than usual.
UK Health Security Agency

Similar increases have been seen elsewhere, such as in Japan and across Europe.

What’s causing this early flu season?

The UK’s flu vaccine uptake seems to be almost identical to previous years, so the increase in cases cannot be explained by a fall in vaccination rates.

One likely factor contributing to the UK’s early spike in flu cases is the strain of influenza virus that’s circulating.

Flu is caused by influenza viruses – mainly the influenza A virus. There are lots of variants of this virus, so they’re usually designated by a combination of H and N numbers. For example, H5N1 is the main cause of the ongoing avian flu pandemic in birds and other animals. Seasonal flu in humans is usually caused by H3N2 and H1N1.

The seasonal flu vaccine is designed to combat these two strains, as well as an influenza B virus alongside them. This vaccine tends to be between 20% and 70% effective at preventing the flu, depending on the year. The vaccine tends to be most effective for school-aged children, especially in preventing severe forms of the disease.

A new vaccine is developed every year as the circulating strains of influenza can mutate over time, reducing vaccine efficacy.

Twice a year (once for each hemisphere), the World Health Organization convenes an expert panel to decide, based on the strains that circulated last year, what strains of influenza should be used to build the vaccine for the coming flu season. The vaccine almost always includes an H1N1, H3N2 and influenza B strain.

Generally, building these vaccines based on what circulated previously is quite effective. This is because any genetic changes that occur in these strains between flu seasons aren’t large enough to render the vaccine ineffective.

But this year there seems to have been an exception. A new strain of influenza, influenza A H3N2 subclade K, is now infecting the majority of people. This strain has seven mutations that differentiate it from the previous H3N2 strain. This is many more genetic mutations than what’s usually seen between seasons.

It’s too early to know why this strain has developed so many genetic mutations. But we do know that these changes appear to have made this strain slightly more transmissible compared to previous strains.

The strain’s R number (the average number of people an infected person will go on to infect) has increased from influenza’s usual 1.2 to 1.4. This means each generation of transmission produces roughly 20% more new infections than we would normally expect, a difference that compounds as the outbreak grows.
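A quick sketch shows why that seemingly small shift matters: cases multiply by R with each generation, so the gap between R = 1.2 and R = 1.4 widens as the outbreak grows. The numbers below are purely illustrative.

```python
# Minimal sketch of why a rise in R from 1.2 to 1.4 matters: each
# generation of transmission multiplies case numbers by R, so the gap
# compounds. Purely illustrative; real epidemics are not this clean.

def cases_after(generations: int, r: float, seed: int = 100) -> float:
    """Expected cases in the nth generation from `seed` initial cases."""
    return seed * r ** generations

for gen in (1, 3, 5):
    low, high = cases_after(gen, 1.2), cases_after(gen, 1.4)
    print(f"gen {gen}: R=1.2 -> {low:.0f} cases, R=1.4 -> {high:.0f} "
          f"(+{(high / low - 1):.0%})")
```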

Early research into this strain shows that the vaccine is still very effective in children at preventing severe forms of the disease. But in adults, effectiveness has dropped to between 30% and 40%.

School-aged children are currently most affected by this season’s flu.
Prostock-studio/ Shutterstock

However, we can’t say just yet whether reduced vaccine efficacy in adults and the new mutations to the H3N2 strain are the causes behind the current spike in flu cases.

It’s also too soon to know whether this year’s flu season will be more severe than in previous years. But based on its early start, the strain’s high R number and low vaccine effectiveness in adults, we might expect higher numbers than usual.

And, if we look at data from the southern hemisphere’s flu season – which usually gives us a good idea of what we should expect – Australia saw its worst flu season ever, reporting 10% more cases than in the previous year.

How to protect yourself

It’s important to note that, especially in children, the vaccine is still the best form of protection. Flu can be very severe in both the young and old, resulting in hospitalisation and sometimes death. Vaccination (including by those who regularly come in close contact with older and younger people) is key.

It’s also important to know how flu symptoms differ from those of the common cold so that you can recover and protect others from catching it. The presence of fever, headache and a strong cough typically indicate the flu.

If you have these symptoms, you should rest and follow standard flu guidance. Also remember you’re infectious for a week or so after symptoms start, so isolating at this time will stop the virus from spreading. Alongside getting the jab, wearing a mask and following good hand hygiene can help you avoid getting sick and prevent you from spreading the flu if you are sick.


If you’ve got a question about the flu vaccine that you’d like an expert to answer, please send them to: clint.witchalls@theconversation.com

The Conversation

Conor Meehan does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Flu season has started early in the UK – here’s what might be going on – https://theconversation.com/flu-season-has-started-early-in-the-uk-heres-what-might-be-going-on-269619

Why musicians are leaving Spotify – and what it means for the music you love

Source: The Conversation – UK – By Andrew White, Senior Lecturer in the Department of Culture, Media & Creative Industries, King’s College London

Vera Harly/Shutterstock

Spotify is haemorrhaging artists. In the last few months alone, a handful of indie bands have exited the streaming platform. If that includes some of your favourite musicians, you may be wondering how best to support them.

Among the artists leaving the platform is indie band Deerhoof. They were reacting to the news that Spotify’s founder, Daniel Ek, had used his venture capital firm to lead a €600 million (£528 million) investment in Helsing, a German defence company specialising in AI. Their statement said: “We don’t want our music killing people.”

This sentiment chimes with the attitudes of the many listeners who cancelled their Spotify subscriptions after the platform ran recruitment ads for ICE, the US’s controversial Immigration and Customs Enforcement agency.

The exodus reflects a general concern that major tech companies are too cosy with the Trump administration. Spotify’s US$150,000 (£114,000) donation to Trump’s inauguration ceremony was cited by Canadian musician Chad VanGaalen as one of the reasons for his departure from the platform.

But these protests are driven as much by a recognition of ongoing structural problems with music streaming business models as by recent events. Music streaming platforms like Spotify, Amazon Music and Apple Music allocate revenue to artists on a pro-rata basis: each artist is entitled to the share of the platform’s overall streaming revenue that matches their share of the total number of streams on the platform.
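A minimal sketch of that pro-rata arithmetic, with invented numbers, makes the disconnect concrete:

```python
# Minimal sketch of the pro-rata model described above: the revenue pool
# is split by each artist's share of *all* streams on the platform, not by
# what each listener actually played. Numbers invented for illustration.

def pro_rata_payouts(streams: dict, revenue_pool: float) -> dict:
    """Split the pool in proportion to each artist's share of total streams."""
    total = sum(streams.values())
    return {artist: revenue_pool * n / total for artist, n in streams.items()}

streams = {"megastar": 9_000_000, "indie_band": 10_000}
print(pro_rata_payouts(streams, revenue_pool=100_000.0))
# The indie band gets ~0.1% of the pool regardless of what its own fans
# paid, which is the gap the user-centric models discussed below address.
```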




There is therefore no direct financial relationship between listeners and the artists that they listen to. This is an opaque structure that fuels musicians’ sense that they are not receiving fair remuneration.

The number of songs on Spotify and similar platforms has grown exponentially in recent years. By Spotify’s own admission, the growth in revenue from music streaming has resulted in a deluge of AI-generated content, with 75 million spam tracks being removed over 12 months in 2024-25.

Despite this purge, it can be assumed that many such tracks remain undetected, and that significant sums are therefore flowing to fake musicians at the expense of real artists. Spotify’s openness to some AI content, exemplified by the continuing presence of the AI band Velvet Sundown in its catalogue, does not assuage artists’ concerns.

The bundling of different types of content can make the allocation of payments to musicians much more complicated. While Spotify’s music and podcast revenue streams are separate, its audiobooks have been bundled into its premium subscription. The effect of this change in 2024 has been to lower the royalty rate of the songwriters whose music appears on its platform. Around the same time, the company decided to remove payments for songs streamed fewer than 1,000 times. This is likely to disproportionately affect artists struggling to get a foothold in the music industry.

Despite all this, overall revenue continues to grow. Spotify claims that the US$10 billion it paid to the music industry in 2024 was the largest ever annual payment by any retailer. Annual rises in its subscription price over the last two years mean that this growth will likely be sustained; its latest quarterly figures, which revealed an operating profit of US$680 million, seem to bear this out. This improvement in Spotify’s finances exacerbates musicians’ feeling that they are not getting their fair share.

Where to go next

So where can you go if you decide to leave Spotify? Given that its main competitors also use the pro-rata payment model and offer the same menu of unlimited music, then probably not to them.

Some streamers have experimented with user-centric models of payment, whereby listeners’ money goes directly to the artists of the songs they stream. This, though, has had limited success, with Deezer capping its scheme at 1,000 streams per person per month, while Tidal ended its own experiment after two years.

There are, though, smaller platforms which deploy user-centric models of payment. Sonstream was popular for a while with independent artists, but at the time of writing its website has only basic functionality.

Resonate is a cooperative with a pay-for-play user-centric model which gives artists and rights-holders 70% of revenue, with the remaining 30% being ploughed back into the business. But the one that appears to come closest to combining an “artists-first approach” with a critical mass of musicians and listeners is Bandcamp. Each time a user purchases something on the platform, 82% of that transaction goes to the artist and/or their label. These payments have amounted to US$1.6 billion to date for not only streamed music, but cassettes, CDs, vinyl records and t-shirts too.

This last observation reflects a wider trend within the music industry and among listeners: the encroachment of algorithms and AI on the curation and listening of music has led many to ditch streaming platforms altogether. This has encouraged artists to be more innovative, with many experimenting with other means of distributing their music, including selling CDs and downloads directly and setting up their own DIY digital platforms.

For Spotify and other streaming platforms there is then a wider existential question about the extent to which it is possible to construct an economically viable business model that satisfies listeners while ensuring that musicians receive fair remuneration for their creativity.




The Conversation

Andrew White does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why musicians are leaving Spotify – and what it means for the music you love – https://theconversation.com/why-musicians-are-leaving-spotify-and-what-it-means-for-the-music-you-love-269231

After resignations at the top, the BBC faces a defining test: what does impartiality mean now?

Source: The Conversation – UK – By Tom Felle, Associate Professor of Journalism, University of Galway

Taljat David/Shutterstock

The sudden departure of the BBC’s director general and head of news marks a moment of real consequence for British public service broadcasting.

Tim Davie and Deborah Turness’s resignations followed controversy over an inaccurately edited clip in a BBC Panorama documentary about Donald Trump. Opponents of the BBC seized on this as further evidence of widespread bias at the broadcaster. It has now become a flashpoint in the wider political and cultural battles surrounding the corporation.

The resignations come as the BBC enters a decisive period. The renewal of its royal charter in 2027 will define the corporation’s funding model and public purpose for the next decade. At the same time, the BBC faces a hostile political climate, sustained financial pressure and a rapidly fragmenting audience.

Recent controversies – from the Panorama edit to earlier disputes over social media conduct and political coverage – have reignited debate about the broadcaster’s duty of “impartiality”. Yet in today’s febrile information climate, it is fair to ask whether that duty remains fit for purpose.

Media regulator Ofcom defines impartiality as “not favouring one side over another”, but also as ensuring “due weight” is given to the evidence. That distinction matters: impartiality is not the same as neutrality. It demands that news be fair, accurate and proportionate – not that every claim be treated as equal.

Impartiality under pressure

The BBC’s crisis, as academic and commentator Adrian Monck observes, is not simply a matter of poor governance, but “the sinking of a ship of the twentieth century British state, dependent on conditions that no longer really exist”.

Impartiality as a professional norm took shape in the mid-20th century, when it became central to the BBC’s mission under its 1947 royal charter. It emerged in a period when there was still broad agreement on shared facts, and a civic space where citizens could reason together even when they disagreed.




Read more:
BBC has survived allegations of political bias before – but the latest crisis comes at a pivotal moment


That era has broken down over the past 20 to 25 years, as the rise of digital platforms and populist politics eroded traditional journalistic gatekeeping. Today’s information environment is shaped by technology companies, populist leaders, political strategists and partisan media outlets, all of which have strong incentives to create confusion and distrust. When political figures deny evidence, distort facts or lie as strategy, reporting their claims as equal to verified facts is not neutrality or impartiality; it is distortion.

Davie understood this tension. Under his leadership, the BBC tried to clarify the meaning of impartiality, strengthen editorial standards and reinforce trust in its reporting.

Yet the organisation, like many news outlets worldwide, is caught in a bind: it is accused of bias from both the left and the right. While in the past this might have suggested a fair balance, in today’s climate such accusations are often weaponised.

As the sociologist Niklas Luhmann has noted, the function of news is to create a shared reality, a minimal consensus about what exists and what matters. When that consensus collapses, the public sphere itself begins to fragment and journalism loses the ground on which democratic discourse depends.

Younger audiences, who are more likely to access news mediated through influencers they perceive as authentic and relatable, are less engaged with traditional news brands. A Reuters Institute study found that young people increasingly turn to personalities rather than established outlets, or avoid news altogether because they see it as untrustworthy or biased.

The broader global trend is unmistakable. Public service broadcasters in the US, Australia, Canada and across Europe are facing declining audiences, reduced funding, politicised attacks and competition from platforms that prioritise outrage and identity performance. The BBC is not unique in this struggle, but because of its scale and cultural importance, the stakes are higher.

Public service media under siege

The BBC is imperfect. It suffers from institutional caution, uneven performance and a reluctance at times to confront its own errors. Yet it remains one of the few media organisations in the world still committed to verification rather than performance.

Its public service mandate, however strained, is one of the last structural defences against the current media culture: one dominated by outrage merchants and ideological broadcasters whose business model is provocation rather than truth.

Once a public sphere is shaped primarily by rumour and outrage, it becomes almost impossible to restore a shared sense of reality. The alternative is visible already in GB News, Fox and Breitbart, where conflict and grievance have displaced evidence.




Read more:
Perfect storm of tech bros, foreign interference and disinformation is an urgent threat to press freedom


The question now is not whether the BBC should continue to defend impartiality, but which version of impartiality it intends to defend. If impartiality means placing all claims side by side regardless of evidential grounding, it becomes a mechanism for laundering falsehood into public discourse. But if it means rigorous truth-telling, proportionate scrutiny and transparency about what we know and how we know it, then it remains both viable and essential.

BBC chair Samir Shah has apologised for the Panorama edit, describing it as an “error of judgement”. But it has exposed how fragile impartiality has become as both a principle and a perception. In an environment where trust is brittle, even minor lapses are magnified into institutional political positions. Impartiality is now judged as much by perception as by practice.

The resignations at the top of the BBC make this moment all the more precarious. The next leadership will determine whether the BBC becomes a smaller, defensive organisation that avoids offence, or a confident public service broadcaster that accepts that truth-telling will sometimes be mistaken for taking sides. Only the latter approach offers any chance of sustaining public relevance.




The Conversation

Tom Felle does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. After resignations at the top, the BBC faces a defining test: what does impartiality mean now? – https://theconversation.com/after-resignations-at-the-top-the-bbc-faces-a-defining-test-what-does-impartiality-mean-now-269575

Kyiv’s European allies debate ways of keeping the cash flowing to Ukraine but the picture on the battlefield is grim

Source: The Conversation – UK – By Veronika Hinman, Deputy Director, Portsmouth Military Education Team, University of Portsmouth

The EU is considering a range of options as it tries to work out how to continue to fund Ukraine’s defence against Russia. There are three mechanisms presently under consideration. One is using Russia’s frozen assets to back a loan of €140 billion (£124 billion). Another is borrowing the money at interest, although this is not popular.

The third idea, which was proposed by Norwegian economists, is that Norway could use its €1.8 trillion sovereign wealth fund – the biggest in the world – to guarantee the loan. Their reasoning was that Norway, Europe’s biggest producer of oil and gas, has made an extra €109 billion from the rise in gas prices after Russia’s invasion.

The situation on the front has been largely static for months, although Russian forces have been making small gains in some key areas. The battles for the strategically important cities of Pokrovsk in the Donetsk region of eastern Ukraine and Huliaipole in the southern region of Zaporizhzhia are a good indication of the progress of the war in general.

It’s hard, amid the flood of disinformation, to accurately monitor from a distance the exact status of these two important battles. Each day brings fresh reports of multiple attacks and advances by Russian troops. There have also been reports that Russian units have captured Pokrovsk. This would be a serious blow for Ukraine, as it’s an important supply hub, with several roads and rail lines converging there.

But the US-based military think-tank the Institute for the Study of War (ISW), which bases its assessments on geolocated footage, has determined that Russia is not yet in full control of Pokrovsk, having to date seized 46% of the city. ISW analysts say Russian military bloggers are “mounting a concerted informational campaign prematurely calling the fall of Pokrovsk, likely to influence the information space”.

The battle for Pokrovsk has raged for nearly 18 months now, without resolution – but with huge casualties on both sides.

Similarly, while the situation in Huliaipole is deteriorating for the Ukrainian defenders, “Russian forces will probably spend considerable time setting conditions for efforts to seize the settlement”, the ISW says.

It’s important to realise that Russian troops first entered Huliaipole on March 5 2022, within weeks of Russia’s full-scale invasion the previous month, but were quickly pushed back by Ukrainian troops. Fighting has continued in the region ever since.

In other words, while both sides have made some tactical gains, neither holds the strategic upper hand.

One thing is clear: despite the claims and counter-claims, both sides have suffered significant casualties. In June 2025, the UK Ministry of Defence estimated that more than one million Russian troops had been killed or injured since the invasion in February 2022. But Russia still retains considerable reserves of troops to call on, and has not yet had to resort to full mobilisation.

Meanwhile, Russia’s economy is holding up despite western sanctions, although the effect of the recent imposition of oil sanctions by the US has yet to be seen. At the same time, Russia’s continuing and thriving diplomatic, economic and military relationships with its “enabler ally” China, as well as with others on the anti-west axis such as Iran and North Korea – which have been supplying Moscow with weaponry and troops, respectively – are helping it sustain its offensive efforts.

The state of the conflict in Ukraine, November 11 2025. Institute for the Study of War

Financing Ukraine’s defence

Ukraine, meanwhile, is now almost entirely reliant on continued western support. Since Donald Trump took office in January, the US stance towards Ukraine has shifted considerably: while Kyiv’s friends in Nato can continue to purchase US weaponry for Ukraine’s war effort, the US will no longer fund those purchases. Consequently, military aid to Ukraine has slowed considerably in the second half of 2025 – by up to 43%, according to the Kiel Institute, a German research non-profit.

EU leaders voted in October to meet Ukraine’s “pressing financial needs” for another two years, but have yet to agree on a way of doing that. Using frozen Russian assets comes with a number of difficulties. These assets are held in Belgium by the securities depository Euroclear, and Belgium is wary of the plan, arguing that if Russia sued over the seizure and won, Belgium could be left liable.

The other obstacle is that it would need to be unanimously approved by EU member states, something that is thought highly unlikely. The idea of using frozen Russian assets has already been rejected by Hungary and Slovakia. And the recent victory of the populist ANO party in the Czech Republic could signal further isolation for Ukraine. One of the first gestures made by the new Czech government has been to remove the Ukrainian flag from the parliament building.

If Norway were willing to use its €1.8 trillion sovereign wealth fund to guarantee the loan, it would effectively bypass the need for EU unanimity. But the country’s finance minister, former Nato secretary-general Jens Stoltenberg, appeared to rule that out on November 12 when he said guaranteeing the whole amount was “not an option”.

What difference is this loan likely to make in the grand scheme of things? The funds supplied thus far have kept Ukraine from defeat, but have not enabled it to strike a decisive blow against Russia that would win the war or allow it to negotiate a just peace.

At the same time, it is realistic to acknowledge that while a massive injection of funds would help Ukraine stabilise its economy and buy enough arms to give its troops a better chance on the battlefield, money alone cannot deliver the manpower or the morale needed to win the war. In the end, this latest wave of aid may buy Ukraine time – but it’s unlikely to deliver victory.

The Conversation

Veronika Hinman does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Kyiv’s European allies debate ways of keeping the cash flowing to Ukraine but the picture on the battlefield is grim – https://theconversation.com/kyivs-european-allies-debate-ways-of-keeping-the-cash-flowing-to-ukraine-but-the-picture-on-the-battlefield-is-grim-269541

Early climate models got global warming right – but now US funding cuts threaten the future of climate science data

Source: The Conversation – UK – By Gemma Ware, Host, The Conversation Weekly Podcast, The Conversation

bear_productions/Shutterstock

Since the 1960s, scientists have been developing and honing models to understand how the earth’s climate is changing. These models help predict the phenomena that accompany that change, such as stronger storms, rising sea levels and warming temperatures.

One such pioneer of early climate modelling is Syukuro Manabe, who won the Nobel prize in physics in 2021 for work that laid the foundation for our current understanding of how carbon dioxide affects global temperatures. That same year, a seminal paper he co-authored in 1967 was voted the most influential climate science paper of all time.

Syukuro Manabe at the Geophysical Fluid Dynamics Laboratory. Geophysical Fluid Dynamics Laboratory

In this episode of The Conversation Weekly podcast, we speak to Nadir Jeevanjee, a researcher at the same National Oceanic and Atmospheric Administration lab where Manabe once worked. He looks back at the history of these early climate models and at how many of their major predictions have stood the test of time.

“On one hand, we’ve gone way beyond Manabe in the decades since,” says Jeevanjee. “And on the other hand, some of those insights were so deep that we keep coming back to them to deepen our understanding.”

And yet, as climate negotiators gather in the Brazilian city of Belém, on the edge of the Amazon, for the Cop30 climate summit to hammer out new pledges on reducing carbon emissions and paying for climate adaptation, the data sources that climate scientists around the world rely on to monitor and model the climate are under threat from funding cuts by the Trump administration.

“We all do this work because we believe in its importance,” says Jeevanjee. “And so the idea that the work isn’t necessarily valued by the present government, or that we wouldn’t be able to do it, or that somehow our lab and the models that it produces and all the science that comes out of it will be curtailed or shut, is alarming.”

Listen to the interview with Nadir Jeevanjee on The Conversation Weekly podcast, and read an article he wrote about five forecasts that early climate models by Suki Manabe and his colleagues got right.

This episode of The Conversation Weekly was written and produced by Katie Flood, Mend Mariwany and Gemma Ware. Mixing by Eleanor Brezzi and theme music by Neeta Sarl.

Newsclips in this episode from CNN.

Listen to The Conversation Weekly wherever you get your podcasts, or download it directly via our RSS feed. A transcript of this episode is available via the Apple Podcasts or Spotify apps.

The Conversation

Nadir Jeevanjee works for NOAA’s Geophysical Fluid Dynamics Laboratory, which is discussed in this podcast episode. The views expressed herein are in no sense official positions of the Geophysical Fluid Dynamics Laboratory, the National Oceanic and Atmospheric Administration, or the Department of Commerce.

ref. Early climate models got global warming right – but now US funding cuts threaten the future of climate science data – https://theconversation.com/early-climate-models-got-global-warming-right-but-now-us-funding-cuts-threaten-the-future-of-climate-science-data-269639