Back in ancient Rome, Seneca was already railing against tourists

Source: The Conversation – (in Spanish) – By Freya Higgins-Desbiolles, Adjunct professor and adjunct senior lecturer in tourism management, University of South Australia

This hot European summer, anti-tourism protests have grabbed headlines, from Barcelona to Venice, by way of Mallorca and the Canary Islands. But the unrest is not limited to Europe.

A few weeks ago in Mexico City, several peaceful demonstrations against overtourism and gentrification driven by foreign "digital nomads" turned violent when a small group of participants smashed shop windows and looted stores.

And earlier this year, Japan's tourism office urged Australians to swap trips to Tokyo and Kyoto (where tourists have been accused of harassing geishas) for less-travelled destinations. Tourists have also been criticised for bad behaviour in Antarctica and Bali (where tourism accounts for 60–70% of gross domestic product).

Although discontent over overtourism in Europe dates back to at least 2017, this year marks a milestone: for the first time, activists across the continent have coordinated their protests. Locals have turned to anti-tourism graffiti in Athens, water-pistol attacks in Italy, Portugal and Spain, and a flotilla protest against cruise ships in Venice. So much so that safety advisories have been issued for travellers heading to Europe during the summer season.

The most common complaints concern overcrowding, unaffordable housing, and damage to the built and natural environment. Elsewhere in the world, there are also concerns about unbalanced tourism policies, insensitive visitors and real-estate speculation.

But local protests against tourism are nothing new. They have a long history: from ancient Rome and 19th-century Brighton to Hawaii and the Caribbean after the boom in mass tourism in the 1950s.

Ancient Rome and 19th-century Brighton

Hostility toward tourists goes back to the beginnings of the "getaway". In the year 51, the philosopher Seneca wrote of those fleeing Rome for the beach:

"Why must I look at drunks staggering along the shore or noisy boating parties […]? Who wants to listen to the squabbles of nocturnal singers?"

It could have been said by a local enduring the alcohol-fuelled excesses of "stag-party tourism" in present-day Amsterdam. The culture clash between residents' lives, centred on work and family, and visitors' "carefree" spirit is timeless.

Even in ancient Rome, locals complained about noisy tourists on the beach.
Shutterstock

The modern foundations of today's tourism were laid in the 19th century in the United Kingdom. They included the travel agency founded by Thomas Cook, the development of railways and steamships, and a culture built on what was known as the European Grand Tour.

Protests and anti-tourist sentiment developed quickly.

In the UK, for example, the wealthy began holidaying at the seaside. Resorts were built to cater to them, but the influx of these newcomers often disrupted residents' lives.

The Brighton riots of 1827 marked one of the earliest confrontations. After tourists complained about fishing nets cluttering the beach and the surly presence of the fishermen, fishing boats were removed from the shore. The protests were suppressed, the boats displaced from the town's main beach, and the tourists' sensibilities appeased.

The 1827 anti-tourism riots in Brighton, UK, protested the removal of fishing boats from the shore following tourists' complaints.
Detroit Publishing Co/Library of Congress/Wikipedia

In the 1880s, several protests aimed to stop trainloads of visitors reaching the UK's picturesque Lake District. "The stupid herds of modern tourists let themselves be emptied, like coals from a sack, at Windermere and Keswick," wrote the philosopher John Ruskin. The protesters won at least a temporary victory.

Cruise ships, theme parks and 'aloha marketing'

Since the second world war, however, the catalyst for protests has been the "massification" of tourism driven by a globalised, commercialised industry, symbolised by cruise ships, jumbo jets and big theme parks.

Mass tourism was the result of growing middle classes who were granted paid holidays. Transport systems made tourism cheaper, more accessible and more widespread. A culture developed in which certain segments of the world's population came to see frequent holidays as a right rather than an exceptional privilege.

The book The Golden Hordes includes a chapter titled "Paradise Rejected". It documents local anti-tourist sentiment from the Caribbean to Hawaii and Europe. Its authors, Louis Turner and John Ash, recount violent anti-tourist incidents in the 1970s in places such as Jamaica.

Governments routinely promoted themselves with national "smile campaigns" to get tourists to see their nations as prospective destinations. This happened even as many of those countries were decolonising and charting paths to independence.

Hawaii's Indigenous Kanaka Ma'oli have been protesting for decades as the industry has developed. Tourism in Hawaii has also been built partly on the abuse of their culture, especially the commercialisation of "aloha", idealising their way of life in stereotyped form to appeal to travellers' exotic fantasies.

Many of Hawaii's protests take place on the beaches, where locals brief visitors on the political context and the housing crisis driven by tourism. From 2004, some local activists began staging "detours" for travellers, to share residents' views and tell stories outside the commercial narrative.

More recently, after tourism reopened swiftly following the 2023 Maui fires, Hawaiians chose to protest with a mass "fish-in". A coalition organised locals to stand with rods and tackle in front of the beach resorts at Kaanapali, to draw attention to the lack of permanent housing for residents and the slow pace of disaster recovery.

This is a clear example of touristification, in which residents feel tourism's success is prioritised over local wellbeing.

This era has also seen governments compete to host sporting mega-events, partly for the tourism revenue they bring. Brazilian cities saw several demonstrations against the enormous cost of hosting the 2014 FIFA World Cup, which were put down by riot police.

Protests may soon give way to more comprehensive community strategies. Social movements against overtourism and touristification are organising. A conference convened by the global network Stay Grounded, for example, was recently held in Barcelona, bringing together participants from across Europe to build coalitions aimed at empowering communities.

Looking back, "anti-tourism" may be a misnomer. Locals are not necessarily against tourists or tourism. They are against disrespectful visitors, an industry driven by growth at any cost, and governments that fail to manage effectively in the interests of their own residents.

It has long been clear that we need to do better, and fed-up local communities are taking matters into their own hands.

The Conversation

Freya Higgins-Desbiolles was a co-founder and participant of the Tourism Alert and Action Forum, a global network defending community rights in tourism (now inactive). She has also taken part in responsible and ethical tourism organisations, including Australia's Responsible Tourism Network (now defunct) and Equality in Tourism (no longer affiliated), and has worked with the Alternative Tourism Group of Palestine.

ref. Ya en la antigua Roma Séneca clamaba contra los turistas – https://theconversation.com/ya-en-la-antigua-roma-seneca-clamaba-contra-los-turistas-262530

There is no magic wand to rid the planet of all its plastic

Source: The Conversation – (in Spanish) – By Jordi Diaz Marcos, Profesor departamento materiales y microscopista , Universitat de Barcelona

Aleksandr Grechanyuk/Shutterstock

In barely 70 years, we have gone from producing two million tonnes of plastic a year (in 1950) to more than four hundred million (in 2022). And the figures have accelerated in the 21st century: more than half of all the plastic in existence has been made since 2000. If this trajectory continues, production is expected to approach 1.5 billion tonnes by 2050.

Of this enormous total, less than 20% is recycled. Clearly, plastic's great benefits come with a terrible burden: the associated environmental pollution, which grows bigger every day.

What if we could wave a wand and make plastics disappear?

Although there is a broad critical debate around plastics, any serious, demagoguery-free discussion of replacing them must involve other materials such as glass, metal, wood or ceramics. These alternatives, while useful, pose significant challenges.

First, they are heavier materials, which means higher energy costs. A one-litre glass bottle, for example, can weigh up to twenty times as much as a plastic bottle of the same capacity.

And what would deforestation look like if wood replaced plastics and were used on a massive scale? What waste would the mass production of glass and metal generate? These materials would also produce waste that is difficult to post-process and recycle. Taken together, these factors would have a highly damaging impact on our planet.

The case of hospitals

Plastics have irreversibly transformed our lives, and their absence would radically change our society. Responsible use, and the development of sustainable alternatives, are therefore crucial to securing a cleaner, healthier future.

Sectors such as medicine and the automotive industry have evolved exponentially thanks to the development of plastics. To all critics of this material who consider replacing it feasible, I would ask: how would you run a hospital without plastic? What would you make gloves, tubing, syringes or blood and IV bags from? What would the absence of plastics mean for hospital safety and hygiene?

It is fair and realistic to argue, however, that single-use plastic is overused in healthcare. One study at a UK hospital, for example, found that a single tonsillectomy generated more than a hundred separate pieces of plastic waste.

Right now, plastic is essential and irreplaceable in medicine; without it, many lives would be lost.

Irreplaceable, or simply overused?

The medical sector is not the only one dependent on plastic; others would also demand solutions if it were eliminated. From food to services to technology, they would raise questions as basic as: could we sustain today's frenetic growth in electronic devices? What would become of new technologies?

The claim that our food system would collapse without plastic is a bold one, but fairly realistic. What kind of packaging would we have? Could we keep food just as fresh and safe? Could we guarantee food supplies to every corner of the planet?

Can we live without plastics, then? The answer is no. But that is no reason to ignore how the unchecked, unsustainable growth in their use poses a problem with no easy solution: plastic pollution.

Nanotechnology enters the scene

If we want a balanced, responsible approach to plastics, we must fundamentally rethink how we make, use and reuse them, so that they do not simply end up as useless waste. The circular economy offers a promising framework for achieving this.

This is where new advances come in, such as nanotechnology designed to detect microbial or biochemical changes in food. In this context, several research teams are working on "smart packaging" that will provide information about the product it contains.

Improved recycling techniques will also be key in future, along with a firm commitment to chemical recycling, in which polymer waste is chemically broken down for use as feedstock to make new plastics. It is an approach fully aligned with the circular economy. For all its benefits, though, barriers remain to be overcome, such as its energy demands and its lower yields compared with mechanical recycling.

Changing habits

A world without plastics is not possible, but neither is a world with our current level of consumption. Any call to action to end our dependence on plastics must therefore come with clear, tangible steps and an understanding of the implications of our choices.

If we want to move toward a circular economy, we have no choice but to abandon the current "take, make, dispose" model. We must redesign products to be more durable, reusable, repairable and recyclable. Are we ready to change our habits? The answer to that question will shape our future, with or without plastics.

The Conversation

Jordi Diaz Marcos does not receive a salary, do consultancy work, own shares in, or receive funding from any company or organisation that would benefit from this article, and has declared no relevant affiliations beyond his academic position.

ref. No existe una varita mágica para eliminar todo el plástico del planeta – https://theconversation.com/no-existe-una-varita-magica-para-eliminar-todo-el-plastico-del-planeta-260530

Nationalism is back with a vengeance: from the 19th century to Trump's MAGA

Source: The Conversation – (in Spanish) – By Manuel Torres Aguilar, Catedrático de Historia del Derecho y de las Instituciones y director de la Cátedra UNESCO de Resolución de Conflictos, Universidad de Córdoba

A Trump supporter wears a cap bearing the MAGA slogan. Roschetzky Photography/Shutterstock

The nationalist ideology born in the late 19th century shaped much of history from then until nearly the middle of the 20th. Now it is knocking on our present once again.

Several current examples share elements of that old ideology. One has been dragging on for a long time: China's claim over Taiwan. Another has stood at Europe's doorstep for rather less: Russia's aspiration to be a Greater Russia, with all that entails.

The great US festivities of 2026

Now joining this flowering of exclusionary nationalism is Donald Trump's MAGA (Make America Great Again), which will find its most fundamentalist and provocative expression in the 2026 celebration of the 250th anniversary of the United States Declaration of Independence: anti-immigration policy and a reborn, furious MAGA nationalism.

In Europe, meanwhile, far-right parties are trying to imitate the model, asserting national sentiment above any proposal for multilateralism or integration and attacking the foundations of the European Union with the most outlandish arguments.

In doing so they do the dirty work, like a Trojan horse, for Trump's project of weakening and, if possible, breaking the Pax Europaea that was built on the ashes of more than 55 million dead.

Some of the activities planned for next year's celebrations in the United States are strongly reminiscent of the parades of Nibelungs, deities and pan-Germanic myths, the exaltations of Aryan purity and the like, that filled Germany's streets in the years before the second world war.

In 2028, the US will once again even host an Olympic Games, in Los Angeles, in case any ingredient were missing.

The message the Trump administration wants to send is no trivial matter. Its project to rewrite American history, on the premise that all the lies introduced by "radical left" culture must be eliminated, fulfils its wish for a history without historians. To that end, it is working on a history that alters content, museums, archives and bibliography, if necessary, to exalt the value of its own over any integrating element.

The roots of nationalism

Nationalism as we know it has not always accompanied humanity. It is an ideology that emerged between 1880 and 1914, though its antecedents lie in the aftermath of the French Revolution. The term first came into use in France, Italy and Germany to describe far-right ideological groups that invoked the homeland against foreigners, liberals and socialists.

The basis of nationalism is the drive to bind individuals emotionally to the identity markers of their nation, against others deemed inferior, so that anyone who does not share these principles is simply a traitor to the homeland.

As for commemorations, celebrations, centenaries and the like, these do not have roots stretching back into time immemorial. They too are an invention tied to nationalism.

Consider that, however hard a researcher looks, they will find no reference to a third centenary of the discovery of America, let alone a second or a first. It simply did not exist in political consciousness. Monarchs celebrated only their birthdays, name days and the odd religious occasion. The anniversary was first commemorated in 1892, to mark 400 years since Christopher Columbus's voyage.

"Patria" and "España" were not always synonyms

To keep putting the strength of these concepts in perspective: in Spanish, the word "patria" was not synonymous with Spain until the 19th century; before that, the term referred to one's place of birth.

Something similar happened in Italian with the word "paese". As local communities such as the village, the town and the district weakened, the patria became the metaphor that bound people into the imagined nation. Schools, the new mass media and even religion gradually helped create and strengthen the idea of the community, of an "us" set against the community of the "other".

The Austro-Hungarian Empire is a paradigmatic case: there, the sense of belonging to a nation was not incompatible with support for the Habsburg monarchy. That held, of course, until it all blew apart with the empire's fall, and the nations of central Europe hardened their national idea, which would lead to the European wars of the 20th century.

By 1914, it was no longer individual glory or conquest that inspired belligerents, but the idea of a threat to "us", of "their" aggression against our freedom and our civilisation. It is no accident that xenophobia found its best breeding ground at this very moment. Our victory was no longer our own glory, but the homeland's.

Recall that after the Great War, since the homeland now meant everyone, the idea of the battlefield was abandoned and the concept of total war was born, reaching its zenith in the second world war. In it, the dead were everyone, not just soldiers, and the bombs fell on everyone: children, women, the elderly, civilians in general. Exactly how wars are waged today: striking the rear, the civilians, and turning the whole community into a military target.

Disasters in the name of the nation

Since the late 19th century, the nation, the homeland, has been the source of humanity's greatest disasters. By contrast, collective and integrative projects (the United Nations, Unesco, the World Health Organization, the FAO and, of course, the European Union) have brought the eras of greatest global or regional prosperity, solidarity and peace.

We stand forewarned, because both models are close at hand. After Napoleon, until the birth of the nationalisms, Europe had lived in peace. Then it all ended. The renewed prominence of this ideology augurs nothing good.

The Conversation

Manuel Torres Aguilar does not receive a salary, do consultancy work, own shares in, or receive funding from any company or organisation that would benefit from this article, and has declared no relevant affiliations beyond his academic position.

ref. El nacionalismo vuelve con fuerza: del siglo XIX al MAGA de Trump – https://theconversation.com/el-nacionalismo-vuelve-con-fuerza-del-siglo-xix-al-maga-de-trump-261333

Do the cooling blankets going viral on TikTok actually work?

Source: The Conversation – (in Spanish) – By María Dolores Martín Alonso, Materials Science PhD, IMDEA MATERIALES

nito/Shutterstock

TikTok has an astonishing ability to turn everyday objects into viral miracles. One day it's a cream that erases wrinkles in seconds; the next, a blanket promising cool nights without air conditioning. "Cooling blankets" are the algorithm's new fetish: videos with millions of views show influencers wrapping themselves in fabrics that, they claim, "absorb body heat".

One of the most talked-about videos comes from the site SlashGear, which ran a hands-on test with one of the best-selling viral blankets. They left that blanket and a conventional one out in the sun. The result? The "cooling" blanket's outer surface measured up to 6°C cooler. At first glance it looks like a win… but physics, as usual, asks for a second opinion.

Why the blanket measures cooler

Why does it show that temperature drop when no human has even touched it? The key lies in how each fabric absorbs, reflects or dissipates heat from its surroundings. Some synthetic materials, such as nylon or modified polyethylene, reflect more solar radiation or heat up less in the sun, which can explain that surface difference.

That difference, however, does not automatically guarantee a lasting sensation of coolness once we touch the blanket. The feeling of thermal relief on contact is mainly down to thermal conductivity. Some fabrics, such as the aforementioned nylon or polyethylene, draw heat away from our skin more efficiently than others, such as cotton. It is the same principle by which a metal handrail feels much hotter in the sun than a wooden one, even though both are exposed to identical conditions.
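The handrail comparison can be made quantitative. When skin touches an object, the interface settles near a temperature weighted by each material's thermal effusivity, e = √(k·ρ·c). The sketch below uses this standard textbook formula with illustrative, order-of-magnitude material properties (the specific numbers are my assumptions, not values from the article):

```python
import math

def effusivity(k, rho, c):
    # Thermal effusivity e = sqrt(k * rho * c),
    # k: conductivity [W/(m·K)], rho: density [kg/m^3], c: specific heat [J/(kg·K)]
    return math.sqrt(k * rho * c)

def contact_temperature(t_skin, e_skin, t_obj, e_obj):
    # Interface temperature of two semi-infinite bodies brought into contact:
    # T_c = (e1*T1 + e2*T2) / (e1 + e2)
    return (e_skin * t_skin + e_obj * t_obj) / (e_skin + e_obj)

# Illustrative properties (assumed, rough values)
e_skin = effusivity(0.37, 1000, 3500)   # human skin
e_wood = effusivity(0.15, 600, 1700)    # dry wood
e_steel = effusivity(45, 7800, 490)     # steel

t_skin, t_sun = 34.0, 50.0  # °C: skin vs a sun-heated surface

print(round(contact_temperature(t_skin, e_skin, t_sun, e_wood), 1))
print(round(contact_temperature(t_skin, e_skin, t_sun, e_steel), 1))
```

With these numbers, the wooden rail yields a contact temperature only a few degrees above skin temperature, while the steel rail lands close to the full 50°C, which is why the metal "feels" so much hotter even though both are at the same temperature.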

That is why many people who try these blankets in a mildly warm room say "yes, it does feel cooler", at least at first. But that initial sensation doesn't last. On forums such as Reddit it is easy to find experiences that clash with the initial euphoria: "The first ten minutes, great. Then it was like wrapping myself in cling film."

The effect disappears once thermal equilibrium is reached

What happens is that, after absorbing our body heat, the fabric quickly reaches thermal equilibrium. If that heat is not dissipated (say, because we are lying still with no ventilation, or the room is very hot), the blanket stops feeling cool. In other words, without a mechanism to maintain the thermal gradient, the effect disappears.

Some blankets, however, do manage to sustain that gradient for longer, thanks to materials specifically designed for the job. And this is where the physics comes into play.

The basic physics: a phase change

The trick is not in the fabric, the texture or some secret Coca-Cola-style formula. It lies in a basic principle of thermal physics: the phase change.

When a material changes state (for example, from solid to liquid), it must absorb a large amount of energy without its temperature rising. That energy is called the latent heat of fusion. The most everyday example is ice: it can absorb a great deal of heat as it melts, yet it stays at 0°C until it has turned entirely into water.

Genuinely cooling blankets use materials called PCMs (phase change materials), designed to melt at temperatures close to human thermal comfort, between 18 and 21°C. During that change of state they absorb the body's heat without warming up until all the PCM has melted, which sustains the sensation of coolness for longer.

Imagine covering yourself with a blanket full of "invisible ice cubes" that melt at just the right temperature. As they do, they "drink up" part of the heat we generate while sleeping. That is, quite literally, the essence of a PCM blanket. Best of all, once the material has fully melted, it can be "recharged" by leaving the blanket somewhere cool so that it solidifies again.
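The "invisible ice cubes" picture lends itself to a back-of-the-envelope estimate: a PCM can buffer heat only until it has fully melted, and that time is t = m·L / P, where m is the PCM mass, L its latent heat of fusion and P the heat flow into the blanket. The figures below are illustrative assumptions (typical paraffin latent heat and a plausible body-to-blanket heat flow), not data from the article:

```python
def buffering_time_minutes(pcm_mass_kg, latent_heat_j_per_kg, heat_flow_w):
    # Time until the PCM is fully melted: t = m * L / P, converted to minutes
    return pcm_mass_kg * latent_heat_j_per_kg / heat_flow_w / 60

# Assumed figures: ~0.5 kg of paraffin-type PCM (latent heat ~200 kJ/kg)
# absorbing ~30 W of body heat through the fabric.
t = buffering_time_minutes(0.5, 200_000, 30)
print(round(t), "minutes")
```

Under these assumptions the blanket buffers heat for a bit under an hour, which squares with user reports of a cooling effect that is real but temporary until the blanket is "recharged" somewhere cool.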

The blanket of the future

Making these genuinely cooling blankets requires materials science. Not every solid melts at a temperature useful for human comfort while also absorbing a significant amount of heat. The most common PCMs fall into three broad groups: organic, inorganic and eutectic.

Organic PCMs, such as paraffins, are popular for their stability and low cost. They consist of long hydrocarbon chains that absorb heat as they melt and remain stable over many thermal cycles. Their melting temperature can be tuned by choosing the number of carbon atoms.

In blankets, these PCMs are encapsulated in microstructures, usually polymer capsules, which let them pass from solid to liquid without leaking or damaging the textile. The encapsulation protects the material from degradation and allows the blanket to withstand many cycles without losing effectiveness.

So are they already on the market, or still laboratory science?

The invisible ice cubes are already a reality

Although "invisible ice cubes" may sound like science fiction, phase change materials are already found in real products: not only blankets, but also sportswear, technical footwear and bioclimatic architecture.

In textiles, several brands have begun selling fabrics that incorporate PCM microcapsules. One of the best known is Outlast Technologies, which grew out of collaborations with NASA and applies these technologies to thermal clothing, bed sheets and jackets.

Meanwhile, the research continues to advance. The most active lines focus on improving long-term stability, increasing thermal conductivity, and developing more sustainable materials with the highest possible latent heat of fusion per unit mass. The challenge is no longer to prove that they work, but to make them work reliably, affordably and comfortably.

Like many viral fads, cooling blankets have one foot in reality and another in exaggeration. Some do work, not by magic or a "secret formula kept in a sealed envelope", but thanks to well-understood principles of physics and materials engineering. And even if the effect is neither everlasting nor miraculous, it may be enough to get through a summer night without sweating buckets.

The Conversation

María Dolores Martín Alonso does not receive a salary, do consultancy work, own shares in, or receive funding from any company or organisation that would benefit from this article, and has declared no relevant affiliations beyond her academic position.

ref. ¿Funcionan las mantas refrescantes virales en TikTok? – https://theconversation.com/funcionan-las-mantas-refrescantes-virales-en-tiktok-261779

By firing the Bureau of Labor Statistics chief, the Trump administration raises concerns that it may further restrict the flow of essential government information

Source: The Conversation – USA (2) – By Sarah James, Assistant Professor of Political Science, Gonzaga University

Do government programs work? It’s impossible to find out with no data. Andranik Hakobyan/iStock via Getty Images Plus

President Donald Trump’s firing of Bureau of Labor Statistics Commissioner Erika McEntarfer on Aug. 1, 2025, after an unfavorable unemployment report has been drawing criticism for its potential to undercut the agency’s credibility. But it’s not the first time that his administration has taken steps that could weaken the integrity of some government data.

Consider the tracking of U.S. maternal mortality, which is the highest among developed nations. Since 1987, the Centers for Disease Control and Prevention has administered the Pregnancy Risk Assessment Monitoring System to better understand when, where and why maternal deaths occur.

In April 2025, the Trump administration put the department in charge of collecting and tracking this data on leave.

So far, there are no indications that any BLS data has been deleted or disrupted. But there have been reports of that occurring in other agencies of all kinds.

The White House is also collecting less information about everything from how many Americans have health insurance to the number of students enrolled in public schools, and making government-curated data of all kinds off-limits to the public. President Donald Trump is also trying to get rid of entire agencies, like the Department of Education, that are responsible for collecting important data tied to poverty and inequality.

His administration has also begun deleting websites and repositories that share government data with the public.

Why data is essential for the safety net

I study the role that data plays in political decision-making, including when and how government officials decide to collect it. Through years of research, I’ve found that good data is essential – not just for politicians, but for journalists, advocates and voters. Without it, it’s much harder to figure out when a policy is failing, and even more difficult to help people who aren’t politically well connected.

Since Trump was sworn in for a second time, I have been keeping an eye on the disruption, removal and defunding of data on safety net programs such as food assistance and services for people with disabilities.

I believe that disrupting data collection will make it harder to figure out who qualifies for these programs, or what happens when people lose their benefits. I also think that all this missing data will make it harder for supporters of safety net programs to rebuild them in the future.

Why the government collects this data

There’s no way to find out whether policies and programs are working without credible data collected over a long period of time.

For example, without a system to accurately measure how many people need help putting food on their tables, it’s hard to figure out how much the country should spend on the Supplemental Nutrition Assistance Program, formerly known as food stamps, the federal supplemental nutrition program for women, infants and children, known as WIC, and related programs. Data on Medicaid eligibility and enrollment before and after the passage of the Affordable Care Act in 2010 offers another example. National data showed that millions of Americans gained health insurance coverage after the ACA was rolled out.

Many institutions and organizations, such as universities, news organizations, think tanks, and nonprofits focused on particular issues like poverty and inequality or housing, collect data on the impact of safety net policies on low-income Americans.

No doubt these nongovernmental data collection efforts will continue, and maybe even increase. However, it’s highly unlikely that these independent efforts can replace any of the government’s data collection programs – let alone all of them.

The government, because it takes the lead in implementing official policies, is in a unique position to collect and store sensitive data collected over long periods of time. That’s why the disappearance of thousands of official websites can have very long-term consequences.

What makes Trump’s approach stand out

The Trump administration’s pausing, defunding and suppressing of government data marks a big departure from the approach of his predecessors.

As early as the 1930s, U.S. social scientists and local policymakers realized the potential for data to show which policies were working and which were a waste of money. Since then, policymakers across the political spectrum have grown increasingly interested in using data to make government work better.

This focus on data grew starting in 2001, when President George W. Bush made holding government accountable to measurable outcomes a top priority.

He saw data as a powerful tool for reducing waste and assessing policy outcomes. His signature education reform, the No Child Left Behind Act, radically expanded the collection and reporting of student achievement data at K-12 public schools.

President George W. Bush speaks about education in 2005 at a high school in Falls Church, Va., outlining his plans for the No Child Left Behind Act. Alex Wong/Getty Images

How this contrasts with the Obama and Biden administrations

Presidents Barack Obama and Joe Biden emphasized the importance of data for evaluating the impact of their policies on low-income people, who have historically had little political clout.

Obama initiated a working group to identify ways to collect, analyze and incorporate more useful data into safety net policies. Biden implemented several of the group’s suggestions.

For example, he insisted on the collection of demographic data and its analysis when assessing the impacts of new safety net policies. This approach shaped how his administration handled changes in home loan practices, the expansion of broadband access and the establishment of outreach programs for enrolling people in Medicaid and Medicare.

Why rebuilding will be hard

It’s harder to make a case for safety net programs when you don’t have relevant data. For example, programs that help low-income people see a doctor, get fresh food and find housing can be more cost-effective than simply having them continue to live in poverty.

Blocking data collection may also make restoring government funding after a program gets cut or shut down even more challenging. That’s because it will be harder for people who in the past benefited from these programs to persuade their fellow taxpayers of the need to expand a program or create a new one.

Without enough data, even well-intentioned future policies may worsen the very problems they’re meant to fix, long after the Trump administration has concluded.

This article was updated on Aug. 4, 2025, with the BLS news.

The Conversation

Sarah James does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. By firing the Bureau of Labor Statistics chief, the Trump administration raises concerns that it may further restrict the flow of essential government information – https://theconversation.com/by-firing-the-bureau-of-labor-statistics-chief-the-trump-administration-raises-concerns-that-it-may-further-restrict-the-flow-of-essential-government-information-259760

Will the new James Bond embrace hi-tech gadgets in an age of AI? The films have a complicated history with technology

Source: The Conversation – UK – By Christopher Holliday, Senior Lecturer in Liberal Arts and Visual Cultures Education, Department of Interdisciplinary Humanities, King’s College London

Development of a new James Bond film is underway at Amazon Studios, with the creator of Peaky Blinders, Steven Knight, now attached to write the screenplay, which will be directed by Denis Villeneuve.

The pair have given little away about what to expect from Bond 26. Knight said he wanted to do something “the same but different”, while Villeneuve said he would “honour the tradition” of the franchise. But a look back at how the films have dealt with key elements of Bond shows that following tradition can mean going in many different ways.

Take Bond’s toolbag of gadgets, which have been a part of the James Bond movies since their debut in the 1960s. Over the decades, the films have both leaned into and shifted away from the allure of hi-tech gadgetry in ways that plot key turning points in the franchise.

These peaks and troughs reflect what’s going on in the wider world as well as factors such as the influence of other successful film franchises. So with AI on the minds of many right now, the new film could embrace contemporary themes of technology. But re-booting the franchise when a new lead actor is cast is also often associated with a grittier or “back to basics” approach.

The first few Bond films starring Sean Connery, including Dr No (1962), From Russia With Love (1963), and Goldfinger (1964), feature a smattering of spy technology. But by You Only Live Twice (1967), producers had opted for a space capsule hijack narrative – reflecting the influence of the US-Soviet space race – and a villain’s lair in a hollowed-out volcano.

However, the next entry – On Her Majesty’s Secret Service (1969) – centred largely on the emotional realism of Bond’s (George Lazenby) courtship and subsequent marriage to Tracy di Vicenzo (Diana Rigg). The lesser focus on technology coincided with a new Bond actor – a pattern to be frequently repeated later on in the franchise. But for other reasons, the shift in tone was, perhaps, to be expected.

Goldfinger: Q introduces Bond to his Aston Martin.

Bond author Ian Fleming was writing On Her Majesty’s Secret Service at his holiday home – Goldeneye – in Jamaica, while Dr No was being filmed nearby. The book was published on April 1, 1963, the day From Russia With Love began filming (the film was released in October that year). The less gadget-focused approach of On Her Majesty’s Secret Service could be seen as a possible jab by Fleming at what he saw as the cinematic Bond’s growing overreliance on the latest tech.

Journeying back through the franchise, it is not hard to find instances where moments of technological excess are countered almost immediately by a more pared down, character-centred set of priorities.

After On Her Majesty’s Secret Service, Connery returned for one further Eon Productions film, Diamonds Are Forever (1971), which, like You Only Live Twice, featured a space-themed narrative. Live And Let Die (1973), Roger Moore’s debut as Bond, is somewhat more down to Earth and was the first film not to feature Bond’s gadgetmaster Q (who is referred to as Major Boothroyd in Dr No).

But a growing reliance on technology can be seen during the 70s Moore films, culminating with Moonraker (1979) – which was heavily influenced by Star Wars (1977) – in which Bond goes into space.

Moore’s follow-up, For Your Eyes Only (1981), was – as that film’s director John Glen noted – a film that went “back to the grass roots of Bond.” The global economic recession that took place between 1980 and 1982 certainly helped support this shift in tone.

For Your Eyes Only had a lower budget than Moonraker, so the filmmakers had to act in a similar way to their leading character, who made innovative use in the film of his shoelaces to climb up a rope on a sheer rock face in Greece.

The last few Roger Moore films have examples of Bond’s complex connection to technology, such as the computer microchip narrative of Moore’s final film A View to a Kill (1985). But the next film, The Living Daylights (1987), Timothy Dalton’s debut, was a return to the grittier Bond of the novels – with a focus on classic spycraft. From an action-packed opening in Gibraltar, the narrative moves to Bratislava, where Dalton’s Bond helps a KGB general defect to the west.

When Dalton departed after Licence to Kill (1989), which shows the influence of big-budget 80s Hollywood action movies, the series’ return after a six-year hiatus brought Bond into the information age. The cyberterrorist narrative of GoldenEye (1995), Pierce Brosnan’s debut as Bond, is fully indebted to a broader curiosity surrounding emerging internet sub-cultures.

The Living Daylights opening scene (official 007 YouTube)

Brosnan’s final outing, Die Another Day (2002), featured an Aston Martin that could turn invisible, which critics and audiences dismissed as a series nadir. The post-9/11 climate of protector narratives in defence of national security favoured an altogether grittier action cinema, counting Jason Bourne as its most popular hero. Die Another Day’s invisible Aston Martin and the indelible image of a computer-generated Bond surfing amid digital icebergs did not quite align with this state of post-millennial geopolitics.

Enter Daniel Craig, and the franchise’s emphatic declaration that it was going to do things for real, per the title of a documentary on Craig’s debut Casino Royale (2006). This was a statement of intent, anchored not just to a reduction in computer-generated imagery (CGI) behind-the-scenes, but equally by a turn away from the kinds of excessive technological wizardry that defined earlier instalments.

The absence of Q from Craig’s debut Casino Royale (2006) for the first time since Live and Let Die appeared to confirm a more “back to basics” feel. When the character did finally appear in Craig’s third film Skyfall (2012), Q (now played by Ben Whishaw) remarks to Bond: “Were you expecting an exploding pen? We don’t really go in for that anymore.”

Die Another Day trailer.

With another reboot on the way, the question now is whether the new film will draw inspiration from real-world technologies and push once more at the limits of technical innovation. Perhaps Villeneuve will exploit the science-fiction credentials he fine-tuned in Arrival (2016), Blade Runner 2049 (2017) and his successful Dune films (2021-2024).

But given how the contemporary cultural landscape is awash with the threat of AI, maybe the franchise needs to beat a hasty retreat from technology in order to stand out. Either way, the filmmakers will be able to argue they are sticking to tradition.

The Conversation

Christopher Holliday does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Will the new James Bond embrace hi-tech gadgets in an age of AI? The films have a complicated history with technology – https://theconversation.com/will-the-new-james-bond-embrace-hi-tech-gadgets-in-an-age-of-ai-the-films-have-a-complicated-history-with-technology-262447

What we’ve learned in ten years about county lines drug dealing

Source: The Conversation – UK – By Jenna Carr, Graduate Teaching Fellow and Sociology PhD Researcher, University of Liverpool

ThomasDeco/Shutterstock

A decade ago, the National Crime Agency identified a new drug supply method. Before then, drug supply was predominantly carried out by user-dealers – people supplying their social circles to fund their own drug use, rather than for commercial gain.

In 2015, police outside of London identified a pattern of more frequent arrests of young people and vulnerable adults implicated in drug supply outside of their local areas. They were also frequently suspected to be associated with members of criminal gangs. Thus, “county lines” was born.

The National Crime Agency used the term “county lines” to describe the phone or “deal” line used to organise the sale of drugs – mainly heroin and crack cocaine – from cities with oversaturated supplies, to rural, coastal areas with less supply.

The deal line was controlled by gang members based in the inner city area, such as London or Liverpool, known as “exporter” areas. The sale of drugs would be completed by a young or vulnerable person who had been exploited and sometimes trafficked out of their home areas to rural “importer” areas, such as north Wales and Cornwall. The crossing of local authority and police boundaries made county lines difficult to police, and to safeguard those who had been exploited.

County lines is notably violent. It involves gang violence, knife crime, drug misuse, sexual exploitation and modern-day slavery.

Ten years on, county lines as a supply model continues to evolve. A recent assessment by the National Police Chiefs’ Council found that the practice is becoming more localised, with fewer lines running between police force boundaries, and more running from one end of a force to the other end. It is also no longer limited to the supply of class A substances, with police reporting seizures of cannabis, cash and weapons.

Researchers are now suggesting that the term “county lines” itself is outdated, and instead should be replaced with a term that focuses more on the exploitation involved, rather than drug supply.

Who gets involved

County lines affects both children and vulnerable adults. The government has estimated that 14,500 children are at risk of child criminal exploitation, but this is likely an underestimate. Particular risk factors include being aged between 15 and 17, experiences of neglect and abuse, economic vulnerability, school exclusion and frequently going missing from home.

Cuckooing, where a gang will take over homes as a base for drug supply, largely affects vulnerable adults, rather than children.

One challenge in responding to county lines is that vulnerability can be difficult to recognise. Victims and perpetrators of exploitation are often one and the same. Often, victims will be unwilling to cooperate with police, out of fear of legal consequences and repercussions from their exploiters.

Those who have been exploited into participating in county lines often do not accept that they are a victim – they may think they are profiting from their involvement, both financially and socially. The ongoing cost of living crisis draws young and vulnerable people into county lines as a response to poverty and lack of legitimate and financially viable opportunities.

Responding to county lines

My ongoing research looks at the development of county lines policy and responses to the problem over the last ten years. Responses to county lines have been mainly led by law enforcement, with coordinated police “crackdowns”. But research shows that high-profile police operations are largely symbolic, and have the effect of drawing vulnerable people into the criminal justice system, which creates further harm.

One important development has been the use of the Modern Slavery Act to prosecute county lines offences. The purpose of this is to offer a legal defence for someone who has been exploited into selling drugs. But research has shown that, rather than acting as a safeguard and a defence, it acts as a “gateway into criminalisation”.

If someone crosses the boundary of being a victim to becoming a perpetrator of exploitation, they can also find themselves being subjected to punitive criminal justice responses under the Modern Slavery Act. This is especially true for black men and boys, who have historically been treated more harshly, for example through stop and search, in relation to drug crime.

It’s become clear that county lines is an issue that criminal justice alone cannot respond to. Those who are at risk require safeguarding, not criminalisation. To this end, the government funds a specialist county lines victim support service that operates in the four main exporter locations.

But the availability of this support service only in exporter locations shows that the county lines response is a postcode lottery. Police forces in importer areas have fewer resources to dedicate to training officers to deal with complex county lines cases. A consistent national approach is still required.

What’s next?

The current government is planning to make child criminal exploitation and cuckooing specific criminal offences through new legislation. This has been celebrated as a success by child safety charities.

But should more criminalisation be the priority? Research shows that drug prohibition and punitive responses are ineffective at preventing young people and vulnerable adults becoming involved in county lines. The demand for drugs and structural issues such as poverty are fuelling county lines – policing alone cannot address this.

Instead of punitive legal responses, public health and addressing the demand for drugs should be the priority. Investment is needed in support services and social care, which have been decimated by austerity cuts, to build a society where vulnerable people do not need to become involved in drug supply.


Want more politics coverage from academic experts? Every week, we bring you informed analysis of developments in government and fact check the claims being made.

Sign up for our weekly politics newsletter, delivered every Friday.


The Conversation

Jenna Carr does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What we’ve learned in ten years about county lines drug dealing – https://theconversation.com/what-weve-learned-in-ten-years-about-county-lines-drug-dealing-261438

What the world can learn from Korea’s 15th-century rain gauge

Source: The Conversation – UK – By Mooyoung Han, Professor of Environmental Engineering, Seoul National University

The rain gauge with a statue of King Sejong the Great in Seoul, Korea. KoreaKHW/Shutterstock

Droughts and floods are becoming more frequent and more severe across the globe. The cause is often rain — either too little or too much. The monsoon regions of the world, where societies have weathered cycles of drought and deluge for thousands of years, hold essential lessons about rainwater monitoring and conservation.

In Korea, one such lesson dates back to the 15th century. In 1441, during the reign of King Sejong, Korea established the world’s first official rain gauge (cheugugi) — a cylindrical copper instrument — and also created a state-administered rain monitoring network.

This wasn’t just a technical invention; it was part of a wider policy. On September 3 of that year, according to the Annals of the Choson Dynasty (a Unesco Memory of the World record), local magistrates across the country were ordered to measure rainfall regularly and report it to the central government.

This system represented one of the earliest forms of climate data governance and set a precedent for valuing rain as a measurable, manageable and fairly governed resource — a public good to be shared and respected. It also reflected a philosophical tradition in Korea of respecting rain not as a curse, but as a gift — one that must be understood, welcomed and shared.

India too has a rich tradition of rainwater harvesting, spanning from the Vedic period and the Indus–Sarasvati Valley civilisation (3,000–1,500 BC) to the 19th century. Throughout diverse ecological zones, Indian communities developed decentralised systems to capture and store rainwater. The archaeological site of Dholavira in Gujarat, for example, featured sophisticated reservoirs designed to collect monsoon runoff.

Historical records, including ancient inscriptions, temple documents and folk traditions, indicate that these systems were not only engineered but also governed, with established rules for sharing, maintaining and investing in water as a communal resource. In some regions of India, every third house had its own well. Although these practices declined during colonial rule, they are now being revived by local communities, government initiatives, and non-governmental organisations.

The revival of traditional wells is gaining momentum, particularly in urban areas facing water scarcity. For example in the city of Bengaluru in southern India, local communities and organisations are using age-old well-digging techniques to tap into shallow aquifers. These efforts are often supported by the state or central government, as well as specialists and organisations including the Biome Environmental Trust, Aga Khan Trust for Culture, Indian National Trust for Art and Cultural Heritage, and the Centre for Science and Environment.

India’s current prime minister has also launched a campaign called Jal Shakti Abhiyan: Catch the Rain as part of a nationwide effort to restore and promote community-led rainwater harvesting.

Reviving ancient wisdom

In Korea, there’s also been a resurgence of this ancient wisdom in modern contexts. Although urban initiatives like the Star City rainwater management system show promise, the movement towards reviving old practices like rainwater harvesting is still growing.

Meanwhile in Cambodia, the Rain School Initiative empowers students and teachers to manage rainwater for drinking and climate education. Rainwater is not just a technical solution — it is a cultural key to resilience. It offers autonomy, sustainability and hope.

That is why we propose to establish UN Rain Day on September 3, in recognition of Korea’s historical contribution and in celebration of global rain literacy. It is a symbolic date that reminds us how rain has shaped civilisations and how it can shape our future — if only we choose to listen to the wisdom of water.

Designating international days has proven effective in raising awareness and catalysing global action. For instance, World Water Day (March 22) has spurred international cooperation and policymaking on water issues since its establishment in 1993. World Toilet Day (November 19) has elevated the global conversation around sanitation and public health.

A UN Rain Day would spotlight rain as a vital yet often overlooked resource. This is something that’s especially crucial for climate adaptation in monsoon regions and beyond.


Don’t have time to read about climate change as much as you’d like?

Get a weekly roundup in your inbox instead. Every Wednesday, The Conversation’s environment editor writes Imagine, a short email that goes a little deeper into just one climate issue. Join the 45,000+ readers who’ve subscribed so far.


The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. What the world can learn from Korea’s 15th-century rain gauge – https://theconversation.com/what-the-world-can-learn-from-koreas-15th-century-rain-gauge-261530

How letting your mind wander can reset your brain

Source: The Conversation – UK – By Anna Kenyon, Senior Lecturer in Population Health, University of Lancashire

The brain needs time off, too. baranq/ Shutterstock

Every day, we’re faced with constant opportunities for stimulation. With 24/7 access to news feeds, emails and social media, many of us find ourselves scrolling endlessly, chasing our next hit of dopamine. But these habits are fuelling our stress – and our brains are begging for a break.

What our brains really need is time off from concentrating. Not consciously focusing on anything, and allowing the mind to drift, can reduce stress and improve cognitive sharpness.

This can often be easier said than done. But attention restoration theory (ART) can help you learn to give your brain space to drift. While this might sound like a fancy name for doing nothing, the theory is supported by neuroscience.

Attention restoration theory was first put forward by psychologists Rachel and Stephen Kaplan in 1989. They theorised that spending time in nature can help to restore focus and attention.

They proposed there are two distinct types of attention: directed attention and undirected attention. Directed attention refers to deliberate concentration – such as studying, navigating through a busy place or posting on social media. Basically, it’s any activity where our brain’s attention is being directed at a specific task.

Undirected attention is when we’re not consciously trying to focus on anything – instead allowing things to gently capture our attention without trying. Think listening to chirping birds or watching leaves gently rustling in the breeze. In these instances, your attention naturally drifts without having to force your focus.

Without time for undirected attention, it’s thought that we experience “attentional fatigue”. This can make it increasingly difficult to focus and concentrate, while distractions become more likely to grab our attention.

In the past, we encountered many situations in our daily lives that we might classify as “boring”. Moments such as waiting for the bus or standing in the supermarket queue. But these dull moments also gave our minds a chance to switch off.

Now, our smartphones give us the opportunity for constant entertainment. Being able to constantly expose ourselves to intense, gripping stimuli offers little mental space for our overworked brains to recover.

But attention restoration theory shows us how important it is to create space for moments that allow our brains to “reset”.

Restoring attention

The origins of Kaplan and Kaplan’s theory can actually be traced back to the 19th century. American psychologist William James was the first to formulate the concept of “voluntary attention” – attention that requires effort. James’ ideas were published against the backdrop of the broader cultural movement of Romanticism, which lauded nature.

Romantic ideas about the restorative power of nature have since been backed by research – with numerous studies showing links between time in nature and lower stress levels, better attention, improvements in mental health, mood and better cognitive function.

The restorative benefits of nature are backed by neuroscience, too. Neuroimaging has shown that activity in the amygdala – the part of the brain associated with stress and anxiety – is reduced when people are exposed to natural environments, but not when they are exposed to urban environments.

Many of us have grown used to filling every moment of our day with distraction. Head over Heels/Shutterstock

Numerous studies have also since backed up Kaplan and Kaplan’s theory that time in nature can help to restore attention and wellbeing. One systematic review of 42 studies found an association between exposure to natural environments and improvements in several aspects of cognitive performance – including attention.

A randomised controlled trial using neuroimaging of the brain found signs of lower stress levels in adults who took a 40-minute walk in a natural environment, compared to participants who walked in an urban environment. The authors concluded that the nature walk facilitated attention restoration.

Research has even shown that as little as ten minutes of undirected attention can result in a measurable uptick in performance on cognitive tests, as well as a reduction in attentional fatigue. Even simply walking on a treadmill while looking at a nature scene can produce this cognitive effect.

Time in nature

There are many ways you can put attention restoration theory to the test on your own. First, find any kind of green space – whether that’s your local park, a river you can sit beside or a forest trail you can hike along. Next, make sure you put your phone and any other distractions away.

Or, when you face boring moments during your day, instead of picking up your phone try seeing the pause as an opportunity to let your mind wander for a bit.

Each of us may find certain environments to be more naturally supportive in allowing us to switch off and disengage the mind. So if, while trying to put attention restoration theory into practice, you find your brain pulling you back to structured tasks (such as mentally planning your week), this may be a sign you should go somewhere it’s easier for your mind to wander.

Whether you’re watching a ladybird crawl across your desk or visiting a vast expanse of nature, allow your attention to be undirected. It’s not laziness, it’s neurological maintenance.

The Conversation

Anna Kenyon has received research funding from the National Academy for Social Prescribing & Natural England, the University of Lancashire, West Yorkshire Health and Care Partnership and the Institute for Citizenship, Society & Change. She is an Associate member of the Faculty of Public Health.

ref. How letting your mind wander can reset your brain – https://theconversation.com/how-letting-your-mind-wander-can-reset-your-brain-259854

Five things I wish people knew about supplements – by a nutritionist

Source: The Conversation – UK – By Rachel Woods, Senior Lecturer in Physiology, University of Lincoln

Kaboompics.com, CC BY-SA

From collagen powders to immunity gummies, supplements are everywhere – in our Instagram feeds, on supermarket shelves and filling our bathroom cabinets. Promising better sleep, glowing skin, sharper focus or even a longer life, they’re marketed as quick fixes for modern health woes.

As a nutritionist, I’m often asked whether supplements are worth the money – and the answer is: it depends. Based on online claims, you might think they can cure almost anything.

While some supplements do have a valuable role in certain circumstances, they are often misunderstood and frequently oversold. Yet many people are unaware of the risks, the limitations and the marketing tricks behind the labels.

Here are five things I wish more people knew before buying supplements.

1. Start with food, not supplements

If you can get a nutrient from your diet, that is almost always the better option. The UK’s Food Standards Agency defines a food supplement as a product “intended to correct nutritional deficiencies, maintain an adequate intake of certain nutrients, or support specific physiological functions”. In other words, supplements are there to support your diet, not replace real foods.

Whole foods offer much more than isolated nutrients. For example, oily fish like salmon provides not just omega-3 fats, but also protein, vitamin D, selenium and other beneficial compounds. These interact in ways we don’t fully understand, and their combined effect is difficult, if not impossible, to replicate in supplement form.

Scientists have tried to isolate the “active ingredients” in fruit and vegetables to recreate their benefits in pills, but without success. The advantages seem to come from the complete food, not one compound.

That said, there are circumstances where supplements are necessary. For instance, folic acid is recommended before and during pregnancy to reduce the risk of neural tube defects in the foetus. Vitamin D is advised during winter months when sunlight is limited. People following a vegan diet may need vitamin B12, since it is mostly found in animal products.

2. You might not realise you’re taking too much

It is far easier to take too much of a supplement than it is to overdo it with food. In the short term, this might lead to side effects such as nausea or diarrhoea. But long-term overuse can have serious consequences.

Many people take supplements for years without knowing whether they need them or how much is too much. Fat-soluble vitamins like A, D, E, and K are stored in the body rather than excreted. Too much vitamin D, for example, can lead to a build-up of calcium, which may damage the kidneys and heart, as well as weakening bones. High doses of vitamin A can cause liver damage, birth defects in pregnancy, and decreased bone density.

Even water-soluble vitamins can cause problems, with long-term overuse of vitamin B6 being linked to nerve damage.

Since most people don’t regularly check their blood nutrient levels, they often don’t realise something is wrong until symptoms appear.

3. Don’t trust social media advice

Spend a few minutes online and you will probably see supplements promoted as “immune-boosting”, “natural”, or “detoxifying”. These words can sound convincing, but they have no scientific definition. They are marketing terms.

The Food Standards Agency is clear that supplements “are not medicinal products” and “cannot exert a pharmacological, immunological or metabolic action”. Yet many online claims suggest otherwise. This kind of marketing, sometimes called “healthwashing”, gives the impression that supplements have powers they do not. Supplements are not subject to the same testing and regulation as medicines. This means they can be poorly formulated, wrongly dosed, or mislabelled.

The Advertising Standards Authority (ASA) has rules about how health claims can be made, including on social media. But enforcement is difficult, especially with influencer marketing and affiliate schemes. Multi-level marketing (MLM) schemes add further complexity. Sellers, often with no medical or scientific training, promote products using personal anecdotes rather than evidence. While the ASA provides specific guidance on how MLM sellers can advertise supplements, these rules are frequently ignored and rarely enforced, and violations often slip through regulatory gaps – meaning some truly astonishing claims go unchallenged.

4. The supplement industry is more about sales than science

The global supplement market is worth over £100 billion. Like any major industry, its goal is growth and profit. This influences how products are developed and marketed.

If a supplement truly worked, it would be recommended by doctors, not influencers.

Some supplements are supported by evidence, but they tend to be the less eye-catching ones, such as iron or vitamin D. Many others are advertised with claims that stretch far beyond what the research shows and are often promoted by people with no formal training in nutrition or healthcare.

5. Some supplements aren’t safe for everyone

Being available over the counter does not mean a supplement is safe. Even products labelled as “natural” can interact with medicines or cause harm.

St John’s Wort, sometimes used for low mood, can have dangerous side effects if taken alongside some antidepressants, birth control and blood pressure medications. Vitamin K can interfere with blood thinners like warfarin. High-dose iron can cause digestive problems and affect how some antibiotics are absorbed.

Many supplements haven’t been tested for safety in pregnant people. Others, like high-dose vitamin A, are known to be harmful in pregnancy and can pass through breast milk. If you’re pregnant, breastfeeding, taking medication or managing a health condition, speak to a pharmacist, GP or dietitian before starting a new supplement.

Supplements can support health when there is a specific need, but they are not a cure-all. Before spending money on a product with big promises, ask yourself: do I really need this, or would I be better off spending the money on nutritious food?

The Conversation

Rachel Woods does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Five things I wish people knew about supplements – by a nutritionist – https://theconversation.com/five-things-i-wish-people-knew-about-supplements-by-a-nutritionist-262099