Former French President Nicolas Sarkozy sentenced to five years in prison: Republic’s judiciary frees itself

Source: The Conversation – France – By Vincent Sizaire, Associate Lecturer, member of the Centre for Criminal Law and Criminology, Université Paris Nanterre – Université Paris Lumières

Former French President Nicolas Sarkozy has been found guilty of criminal conspiracy in a case related to the Libyan funding of his 2007 presidential campaign. Sentenced to five years in prison, he is due to appear in court on 13 October to learn the date of his incarceration. The unprecedented ruling marks a turning point in the practices of the French justice system, which has gradually freed itself from political power. It also enshrines the republican principle of full and complete equality of citizens before the law, which was proclaimed in 1789 but long remained theoretical.

Nicolas Sarkozy was found guilty of criminal conspiracy by the Paris criminal court on Thursday 25 September over the transfer of millions of euros of illicit funds from the late Libyan leader Col Muammar Gaddafi to finance his 2007 election campaign. As might be expected, the decision promptly drew anger from a large part of the political class.

It’s perfectly legitimate to argue against the ruling on the grounds that it is unfair and unfounded. This applies first and foremost to the defendants, who have every right to appeal the judgement. However, the context in which these outcries take place is a political tinderbox: in April, the leader of the far-right National Rally, Marine Le Pen, was already sentenced to a five-year ban on running for public office after she was found guilty of helping to embezzle €2.9m (£2.5m) of EU funds for use by her party. Following on its heels, Sarkozy’s latest sentence provides yet another opportunity for a large section of the ruling classes to stir controversy over what the French describe as the “government of judges” and others would dub “juristocracy”.

Sarkozy will soon be the first post-war president of France to be imprisoned

Admittedly, the sentence may seem particularly severe: a €100,000 fine, five years of ineligibility and, above all, five years’ imprisonment with a deferred warrant of arrest which, combined with provisional enforcement, forces the convicted person to begin serving their prison sentence even if they appeal.

But if we take a closer look at the offences at play, the penalties hardly appear disproportionate. The facts are undeniably serious: organising the secret financing of an election campaign with funds from a corrupt and authoritarian regime, Libya – whose responsibility for an attack on an airplane that killed more than 50 French nationals has been recognised by the courts – in return for championing it on the international stage.

Given that the maximum sentence is ten years in prison, the penalty can hardly be considered too harsh. What is being contested, rather, is the very principle of a political leader being convicted by the courts, which is seen and presented as an intolerable attack on the institutional balance.

If we take the time to put this into historical perspective, however, we see that the judgments handed down in recent years against members of the ruling class are, in fact, part of a movement to liberate the judiciary from other powers, particularly the executive. This emancipation finally allows the judiciary to fully enforce the requirements of the republican legal system.

Equality of citizens before the law, a republican principle

It should be remembered that the revolutionary principle proclaimed on the night of 4 to 5 August 1789 was that of full and complete equality before the law, leading to the corresponding disappearance of all special laws – ‘privileges’ in the legal sense of the term – enjoyed by the nobility and the high clergy. The Penal Code of 1791 went even further: not only could those in power be held accountable before the same courts as other citizens, but they also faced harsher penalties for certain offences, particularly those involving corruption.

The principles on which the republican legal system is based could not be clearer: in a democratic society, where every person has the right to demand not only the full enjoyment of their rights, but also, more generally, the application of the law, no one can claim to benefit from a regime of exception – least of all elected officials. It is because we are confident that their illegal actions will be effectively punished, in the same way as other citizens and without waiting for a highly hypothetical electoral sanction, that they can truly call themselves our representatives.

When the law favoured the powerful

For a long time, however, this requirement of legal equality remained largely theoretical. Brought under the more or less explicit control of the government during the First Empire (1804-1814), the judiciary remained under the influence of the executive at least until the middle of the 20th century. This is why, until the end of the last century, the principle of equality before the law came up against a singular privilege of ‘notability’ which, except in exceptional situations or particularly serious and highly publicised cases, guaranteed relative impunity to members of the ruling classes whose criminal responsibility was called into question.

The situation only began to change with the humanist awakening of the Liberation in the 1940s. From 1958, magistrates were recruited by open competition and enjoyed a relatively shielded status, as well as a dedicated school, the National School for the Judiciary. The school gradually fostered a demanding code of ethics, encouraged in particular by the recognition of judicial trade unionism in 1972. A new generation of judges emerged who took their mission seriously: to ensure, in complete independence, that the law was properly enforced, regardless of the background of those in the dock.

Bernard Tapie, Jacques Chirac, Nicolas Sarkozy…

It was in this context that something that had been unthinkable a few decades earlier came to pass: the prosecution and conviction of prominent figures on the same basis as the rest of the population. Beginning in the mid-1970s and gathering pace over the following decades, the movement saw the conviction of major business figures, such as the Adidas and football tycoon Bernard Tapie, and then of national political figures, such as the former conservative minister Alain Carignon or the Lyon mayor and deputy Michel Noir. The conviction of former presidents of the Republic from the 2010s onwards – Jacques Chirac in 2011, Nicolas Sarkozy for the first time in 2021 – completed the normalisation of this trend or, rather, put an end to the democratic anomaly of giving preferential treatment to elected officials and, more broadly, to the ruling classes.

This movement, which initially stemmed from changes in judicial practices, was also supported by changes to French law. One example is the constitutional revision of February 2007, which enshrined the Constitutional Council’s case law under which the president of the Republic cannot be subject to criminal prosecution during his term of office, while allowing proceedings to resume as soon as he leaves office. Another is the creation, in December 2013, of the National Financial Prosecutor’s Office, which, although it does not enjoy statutory independence from the executive branch, has demonstrated its de facto independence in recent years.

Any talk of “judicial tyranny” is intended to take aim at this historical development. This rhetoric seeks less to defend the sovereignty of the people than that of the oligarchic rulers.

The Conversation

Vincent Sizaire does not work for, consult, own shares in or receive funding from any organisation that would benefit from this article, and has declared no affiliations other than his research organisation.

ref. Former French President Nicolas Sarkozy sentenced to five years in prison: Republic’s judiciary frees itself – https://theconversation.com/former-french-president-nicolas-sarkozy-sentenced-to-five-years-in-prison-republics-judiciary-frees-itself-266170

Economic sanctions must be rethought: they hit the poorest

Source: The Conversation – in French – By Sylvanus Kwaku Afesorgbor, Associate Professor of Agri-Food Trade and Policy, University of Guelph

Economic sanctions are widely regarded by academics and policymakers as a preferable alternative to military intervention for pressuring governments to change objectionable policies. The idea is simple: instead of using weapons, apply economic pressure on the ruling elite until it changes its behaviour.

The use of economic sanctions has kept rising. According to recent data from the Global Sanctions Database, the number of active sanctions increased by 31% in 2021 compared with 2020, and this upward trend continued in 2022 and 2023.

In Africa, several countries are currently under sanctions imposed by the United States, the United Nations or the European Union. These African states include the Central African Republic, the Democratic Republic of Congo, Guinea, Guinea-Bissau, Mali, Libya, Somalia, South Sudan and Zimbabwe. It is no coincidence that most of these countries are among the hunger hotspots identified by the World Food Programme.

Sanctions can have unintended consequences for citizens, and it is usually citizens who pay the price. When sanctions hit food systems, the impact can be devastating.

I study economic sanctions and their unintended negative effects on developing countries.

In a recent study conducted with colleagues, we analysed the impact of economic sanctions on food security in 90 developing countries between 2000 and 2022. We wanted to explore the potential links between sanctions and hunger against a backdrop of growing global concern about food insecurity.

We focused on two key indicators: food prices and undernourishment (that is, the proportion of people who do not consume enough calories to lead a healthy life).

We measured food prices using the food consumer price index of the Food and Agriculture Organization of the United Nations. This index reflects changes in the overall cost of food and non-alcoholic beverages typically purchased by households.

We also used the same organisation's measure of the prevalence of undernourishment. This is a key indicator for Sustainable Development Goal 2.1, which tracks global progress towards eliminating hunger by 2030.

Our findings are sobering. When sanctions are in place, food prices rise by about 1.2 percentage points compared with periods without sanctions. That may sound small, but in low-income countries where families spend half their income on food, even slight increases make life harder. This does not take into account other external factors that can drive prices up, such as patterns of supply and demand.

We also found that undernourishment rises by 2 percentage points during sanction periods. For countries where millions of people already live on the brink of famine, this is a considerable additional burden.

Why sanctions worsen food insecurity

Sanctions affect economies in several ways, and food is often caught in the crossfire.

First, sanctions disrupt food imports. This is a crucial problem for many developing countries that rely heavily on international markets to feed their populations. Between 2021 and 2023, Africa's food imports totalled around US$97 billion.

At the national level, for example, Ethiopia and Libya imported US$3 billion worth of food, Sudan US$2.3 billion and the Democratic Republic of Congo US$1.2 billion. Sanctions can further restrict trade or raise transport costs, making food both scarcer and more expensive.

Second, sanctions limit access to essential agricultural inputs such as fertilisers, pesticides and machinery. They also hinder technology transfers. For example, farmers in sub-Saharan Africa use on average only 9kg of fertiliser per hectare of arable land, compared with 73kg in Latin America and 100kg in South Asia. These constraints reduce yields, raise production costs and make it harder for farmers to sustain output.

Third, sanctions destabilise financial systems, reduce people's incomes and encourage hoarding. Households already on tight budgets are forced to cut back or turn to cheaper, less nutritious foods.

Finally, sanctions often lead to cuts in food aid, because targeted countries lose access to international assistance. For instance, the recent suspension of US humanitarian aid to Sudan forced 80% of the country's emergency kitchens to close. This impact is particularly severe given that some of the largest food donors, such as the United States and the European Union, are also among the most frequent users of sanctions.

The end result is simple: higher food prices, less food on the table and more hunger.

Not all sanctions are equal

We also found that the type of sanction matters:

  • Trade sanctions that block imports and exports drive food prices up the most. Financial sanctions that freeze assets or cut off access to banking services are also damaging, because they indirectly disrupt agricultural trade.

  • When countries face trade, financial and travel sanctions at the same time, the damage is considerable: food prices rise by more than 3.5 percentage points and hunger increases sharply.

  • Who imposes the sanctions also matters. European Union sanctions led to the largest rise in food prices, while UN sanctions had the greatest impact on hunger, increasing undernourishment by nearly 6 percentage points.

Food as a weapon of war

The UN has warned for years against the use of food as a weapon. In 2018, Resolution 2417 explicitly condemned starvation as a tool of war or political pressure. Yet in practice, sanctions often restrict access to food, medicine and agricultural inputs, even when "humanitarian exemptions" exist on paper.

Food insecurity in Africa is worsening. According to the World Health Organization, one in five people on the continent faces hunger, and the number of undernourished people keeps rising. Sanctions are deepening this crisis.

And the moral dilemma is obvious. The people hit hardest – poor families, smallholder farmers and children – are those least responsible for the behaviour that triggers the sanctions.

Sanctions may be intended to punish regimes, but they often punish ordinary citizens instead.

What needs to change

Sanctions are not about to disappear from global politics. But their design and their humanitarian consequences must be rethought. Three measures could reduce the damage:

  • First, strengthen humanitarian exemptions: ensure that food, fertilisers and aid can move freely, without being blocked.

  • Second, track the impact of sanctions: international agencies such as the Food and Agriculture Organization (FAO) and the World Food Programme (WFP) should monitor how sanctions affect food systems and sound the alarm early.

  • Third, rethink the strategy: if sanctions end up fuelling hunger, instability and migration, they may do more harm than good in the long run.

If the world is serious about eradicating hunger by 2030, it cannot ignore the unintended consequences of sanctions. They must be redesigned to protect the most vulnerable; otherwise they risk becoming not just a diplomatic tool, but also a driver of food crises.

The Conversation

Sylvanus Kwaku Afesorgbor receives funding from the Ontario Ministry of Agriculture, Food and Agribusiness (OMAFA). Kwaku is also an occasional consultant for the African Development Bank and the African Economic Research Consortium. He is the executive founder of the international think tank Centre for Trade Analysis and Development (CeTAD Africa), based in Accra, Ghana.

ref. Economic sanctions must be rethought: they hit the poorest – https://theconversation.com/les-sanctions-economiques-doivent-etre-repensees-elles-frappent-les-plus-demunis-265928

Working sitting or standing? For productivity and for health, it is better to alternate

Source: The Conversation – in French – By Cédrick Bonnet, CNRS Research Fellow, specialist in the influence of body positions on behaviour, cognition and the brain, Université de Lille

Research shows that standing improves visual attention. This finding argues for alternating between sitting and standing over the course of the working day. Adopting this habit would also help counter the harmful health effects of holding either position for too long.

Since the end of the Second World War, mechanisation, greater comfort, the computer, the internet and remote working, among other things, have considerably increased the time we spend sitting.

Today, more than half of the world's population sits for over 50% of the day, which amounts to more than eight hours a day. More worrying still, office workers sit for 65% to 85% of their day, or 11.2 to 12.8 hours, and time spent sitting is expected to keep increasing at least until 2030.

We know that these changes are not without consequences for our health. It has been shown, in particular, that increased sedentary behaviour is associated with a higher risk of cardiovascular disease, type 2 diabetes, cancer, obesity, anxiety and depression. But that is not all: it appears that our position also affects our productivity.

In our research team at SCALab, we hypothesised that a healthy person (meaning, in this context, someone who has no difficulty standing) should perform better standing than sitting, as long as fatigue from standing is not excessive. The results we have obtained so far confirm this. Here is why.

Does our posture affect our effectiveness?

Over the course of a day, people adopt three types of body position: lying down to sleep, standing to move about, and a more or less folded position, notably to sit.

To understand why we might be more effective standing than sitting, it helps to know that, in order to function at their best, our sensory and attentional systems need stimulation, acceleration and perturbation. When standing, our body sways constantly and must continually control its balance to avoid falling; when sitting, it is not perturbed in this way.

In recent years, our research team has carried out several projects to test this hypothesis. In an article recently accepted for publication, we asked 24 young adults to perform an attention task (the Attention Network Test) six times while alternating body positions (sitting, standing), and six times in the sitting position only.

The results show that participants' visual attention was better when alternating body positions than when remaining seated the whole time. Moreover, participants completed the task faster (with shorter reaction times) when they were standing in the alternating condition.

In a second recent article, we asked 17 healthy young adults to perform the same attention task either sitting or standing. We tested whether it is the need to control and adjust one's balance while standing, and not simply the fact of being upright, that can explain better performance standing than sitting.

The analyses indeed showed that the more complex participants' postural sway was, the shorter their reaction times (a significant negative Pearson correlation) and the higher their alerting score (derived from the attention task). By definition, the alerting score reflects a person's ability to achieve and maintain a state of alertness in order to respond quickly to an expected stimulus.

These results indicate that people make their standing postural sway more complex in order to improve their performance on the attention task. Presumably, a seated person would be less able (or even unable) to make their sway more complex precisely because they do not have to control their balance.

In 2024, we asked 24 healthy young adults to perform a modified Stroop task in four different body positions: standing against a wall; standing naturally with the feet together; with the feet at their usual width; and with the feet slightly wider apart than usual.

The results reveal a significant correlation between the number of targets correctly found in this Stroop task and variables describing the sway of the head and of the centre of pressure (the point of application of the resultant of the ground reaction forces exerted by the feet on the floor). In other words, the more participants swayed (in speed and amplitude), the better they were at finding the targets in this modified Stroop task.

In short, these three laboratory studies are consistent with our initial hypothesis: standing optimises performance on short visual attention tasks.

Our results are also in line with those of other scientists. Several researchers have already shown that performance on attention and modified Stroop tasks is better standing than sitting. Other studies have found that alternating between sitting and standing yields significantly better results than sitting alone. Finally, research has shown that long-term productivity is better when alternating between the two positions.

Indeed, attention declines more and more the longer we stay seated, whereas it remains higher when standing – especially during the first 30 minutes of a task. This result matters because it suggests that a person does not become better by standing than by sitting, but rather avoids a decline in performance by standing up.

Is it better to work standing or sitting?

Most published studies find that performance while sitting is identical to performance while standing when tasks last less than ten minutes. Performance while standing can, however, be better than while sitting when tasks last between ten and 30 minutes. Between 30 and 90 minutes, performance in the two positions becomes equivalent again. Beyond 90 minutes, performance should logically be worse standing than sitting, but to our knowledge no research has yet shown this.

For all these reasons, in our view, the best postural routine for optimising performance and productivity is to alternate body positions frequently, holding each for 15 to 30 or 45 minutes.

It should be stressed that postural habits affect not only performance and productivity but also health. We already knew that standing for excessively long periods is very harmful to health. Over the past 20 years in particular, research has also shown that excessive sitting is just as problematic.

It increases the risk of premature death, as well as the risk of developing various serious chronic illnesses: cancer, diabetes, inflammatory, muscular and chronic vascular diseases, and heart attack.

Excessive sitting has also been associated with increased overweight and obesity, as well as with the development of sleep disorders and cognitive problems. Moreover, being sedentary is known to affect mental health, increasing not only the risk of depression but also that of lower vitality at work.

Our review of the literature indicates that, to limit the risk of these health problems, people should spend almost as much time standing as sitting each day – in other words, they should spend 50% of their time standing.

Alternating between sitting and standing every 15 to 45 minutes throughout the day would therefore not only improve productivity but also reduce the health consequences. It increases the time spent standing over the course of the day while keeping fatigue to a minimum.

To achieve this, workers would need to be equipped with sit-stand desks. These are already used in many countries, including the United States, Canada, Australia, China and northern Europe. To help users adopt a beneficial or optimal postural routine, such desks should be paired with a "sit-stand" app designed to guide them. Unfortunately, the apps currently available – including those offered by connected devices such as watches, smartphones and connected desks – remain imperfect. To address this, our team is developing such an app.

The Conversation

Cédrick Bonnet does not work for, consult, own shares in or receive funding from any organisation that would benefit from this article, and has declared no affiliations other than his research organisation.

ref. Working sitting or standing? For productivity and for health, it is better to alternate – https://theconversation.com/travailler-assis-ou-debout-pour-la-productivite-et-pour-la-sante-mieux-vaut-alterner-265209

The crisis of Indonesian policing: Guardians of the people or protectors of power?

Source: The Conversation – Indonesia – By Perdian Tumanan, PhD Candidate in Ethics and Religion, Villanova School of Law

The death of 21-year-old Indonesian online delivery driver Affan Kurniawan, who was crushed by a Barracuda police vehicle during a protest, invites comparisons to George Floyd. The African American was killed by a police officer in Minneapolis in 2020, sparking global Black Lives Matter protests.

One thing unites both cases: they reflect arbitrary violence by those sworn to protect the people.

Policing in the two countries differs greatly in its context: while police in the US are deeply tied to political elites and economic power, the Indonesian police were, in principle, established to serve the people.

Yet this shared pattern of civilian killings raises a pressing question: who are the Indonesian police really protecting?

American police: Guardians of the elites

Alex Vitale, a leading scholar on American policing, argues in his influential book The End of Policing that the roots of American policing are inseparable from three foundational systems of inequality in the 18th century: slavery, colonialism, and the control over the emerging industrial working class.

The commemoration of George Floyd’s killing by Minneapolis police at Union Square in Midtown Manhattan. May 25, 2025.
Christopher Penler

In other words, the American police were not originally created to reduce or prevent crime. Rather, as in many Western nations, they emerged to protect elite interests and maintain control over the working class.

The police thus functioned less as protectors of the public and more as instruments of social control. In essence, they served political elites and economic powers in defending a status quo that favoured them — a legacy that continues to shape the U.S. legal and security system today.

According to Vitale, in such a society, equality existed only among the elites, while the broader nation became a policed society. In this context, crime was defined less by moral conduct than by a person’s socio-economic standing.

For the enslaved and the poor, political rights were nonexistent, and even free expression was unimaginable. Any protest against this imposed order was swiftly branded a crime and harshly punished.

Vitale argues that law enforcement is increasingly armed not only to control the public and instil fear, but also to shield themselves from their own fear of being attacked by the very people they are meant to protect.

This cycle of reciprocal fear persists because the police are never truly connected to, nor in solidarity with, the communities they police.

In such a system, fear governs daily life. It does not flow in just one direction but operates reciprocally — something clearly reflected in the very equipment the police carry.

Indonesian police: Once protectors of the people

I once asked my professor why police officers in the United States always carry guns, even in sacred spaces like churches.

I explained that in Indonesia, the mere presence of guns — regardless of who carries them — instils fear.

The Indonesian Police's Mobile Brigade Unit sparks public outcry after an officer runs over a civilian, killing him.
Protesters gather in front of the Mobile Brigade Police Headquarters, Jakarta, August 29, 2025.
Wulandari Wulandari/Shutterstock

He replied that police carry guns as a precaution, driven by fear of being attacked.

I then showed him photos of Indonesian police mingling freely with civilians, unarmed and unafraid. Intrigued, he asked how this was possible.

I explained that while Indonesian policing partly inherited its structure from the Dutch colonial apparatus, it earned its legitimacy during the nationalist struggle for independence.

In that struggle, the early police were closely tied to ordinary people, giving them a sense of belonging to the society they served.

This rootedness set them apart from the militarised culture of Western policing, where trust is absent. In Indonesia, the police and the people are inseparable — essentially one.

Social media posts comparing Affan’s death to that of George Floyd raise a deeper question: whom do the Indonesian police truly serve today?

Once seen as protectors of the people, the police now increasingly appear aligned with elite interests. Public dissatisfaction is growing, fueled by recurring patterns of violence used to silence dissent, facilitate land dispossession, and suppress indigenous communities.

This perception is further reinforced by the conspicuous wealth displayed by some officers and their families, raising serious questions about integrity and accountability.

These realities deepen a crisis of trust, eroding the very foundations of police legitimacy in a democratic society.

A tool of repression

Affan’s death starkly symbolizes the police’s shift from protecting the people to serving elite interests — a perception reinforced when President Prabowo Subianto, instead of apologising or holding the institution accountable, chose to promote the officers who oversaw the protest.

Indonesian police are involved in a riot with protesters.
A police officer directing traffic at the Tugu Jogja intersection in Yogyakarta.
Rembolle/Shutterstock

Such actions deepen public wounds and confirm suspicions that the police now serve rulers rather than citizens. If this course continues, they will stray even further from their democratic mandate and erode the very trust on which their legitimacy rests.

The Indonesian police must reflect on their roots in the people and heed Vitale’s reminder: policing should not merely serve as a tool of elite power and crime control, but as a force rooted in morality and ethical authority.

If the police forget their roots among the people, they risk ceasing to be guardians of justice and becoming nothing more than guardians of power.

At that point, public trust will collapse. The democratic mandate that once gave birth to the police will be hollowed out, and the institution will no longer be seen as a friend of the people, but as an instrument of repression.

If Indonesia’s police do not have the courage to return to their true calling, the gulf between them and the people will only deepen, leaving behind an institution stripped of legitimacy.

The Conversation

Perdian Tumanan does not work for, consult, own shares in, or receive funding from any company or organisation that would benefit from this article, and has disclosed no affiliations beyond those mentioned above.

ref. The crisis of Indonesian policing: Guardians of the people or protectors of power? – https://theconversation.com/the-crisis-of-indonesian-policing-guardians-of-the-people-or-protectors-of-power-264342

Mushrooms may have been part of early human diets: primate study explores who eats what and when

Source: The Conversation – Africa (2) – By Alexander Piel, Associate Professor in Anthropology, University College London, UCL

Mushrooms may not be the first food that comes to mind when we imagine the diets of wild primates – or our early human ancestors. We tend to think of fruits and green leaves as the preferred foods for monkeys and apes.

But our new study from the Issa Valley in western Tanzania highlights a surprising, and potentially crucial, role for fungi in primate diets.

For nearly two decades, our work has centred on what it means to be a savanna-woodland primate in east Africa. Far from their forest-dwelling cousins, these populations are exposed to higher temperatures, as well as woodland and grassland vegetation where they can find food – or be in danger from predators like wild dogs and hyenas.

Broadly, we are interested in competition between species. For example, how do baboons and smaller monkeys avoid larger (and predatory) chimpanzees when looking for ripe fruits? Mushrooms may provide an answer.

We found that while all three primate species under study consumed mushrooms, their use and reliance differed throughout the year. Mushrooms were seasonally important for red-tailed monkeys and chimpanzees, becoming a fall-back food when ripe fruit was scarce, despite overall making up only 2% of their diet. For baboons, mushrooms were a preferred food, with fungi forming more than a tenth of their diet despite being available for only half the year.

Our findings not only shed light on the way that primates rely on and respond to their environment, but also hint at the evolutionary roots of human mycophagy (mushroom eating). Fungi have been overlooked in research into ancient diets because they don’t fossilise well and leave little trace in the archaeological record.

By examining which foods are consumed by primates, we can better reconstruct scenarios of how early human species may have competed with one another.

Issa fungi foraging

Over four years, we observed three co-inhabiting species – chimpanzees, yellow baboons and red-tailed monkeys – regularly consuming mushrooms.

We used over 50,000 observations of feeding among the three species and found that mushroom consumption wasn’t just incidental. While chimpanzees and red-tailed monkeys ate mushrooms mostly during the wet season, when availability peaked, baboons consumed mushrooms far longer, even when they were relatively scarce.

In fact, for two months of the year, mushrooms made up over 35% of baboons’ diets, suggesting they are a preferred food, not just consumed during fruit-scarce periods, as we suggest for the chimpanzees and red-tailed monkeys.

Chimpanzees and red-tailed monkeys, in contrast, treated mushrooms as a seasonal supplement, valuable when fruits were less abundant. This nuanced difference suggests that mushrooms play different roles within this primate community, depending on ecological strategies and competition dynamics.

Avoiding conflict through fungi

One of the most intriguing ideas to emerge from our study is the concept of niche partitioning: how animals adapt their diets to minimise competition. This is a well-established phenomenon which can manifest in various ways, from bird species occupying different canopy heights, to carnivores targeting different prey.

In habitats where multiple species coexist, finding one’s own food niche can be the key to survival. At Issa, baboons, chimpanzees and guenons (monkeys) might all be using mushrooms in strategic ways to improve feeding efficiency and reduce tension with each other as they respond to periods when (preferred) ripe fruits are insufficient for all three species.

What does this mean for us?

The implications of these findings stretch far beyond western Tanzania. First, they highlight how mushrooms can serve as a rich, seasonal food source, even for large mammals, providing protein, micronutrients and potentially medicinal benefits. This lends support to theories that fungi may have played a significant role in the diets of early hominins.

In fact, the habitat of Issa is thought to resemble the kind of mosaic woodland landscape where human ancestors evolved. If our primate relatives today are exploiting fungi in this environment, it’s plausible that Australopithecus, Homo habilis and other early human species did too.

Despite this, fungi are often overlooked in reconstructions of ancient diets, largely because they don’t fossilise well and leave little trace. Yet ancient DNA from Neanderthal dental plaque from about 40,000 years ago has revealed traces of mushrooms, tantalising clues that fungi may have been more central to prehistoric life than previously believed.

A caution and a call

The study also raises important questions about human-wildlife coexistence. In many parts of Tanzania, mushrooms are harvested by people and sold in local markets. As climate change and human population growth put pressure on wild resources, competition between humans and wildlife over edible fungi may increase. Understanding who eats what and when could help in managing these shared resources sustainably.

At a time when biodiversity is under threat and food security is a growing global concern, this research reminds us that hidden treasures like wild mushrooms aren’t just tasty; they’re significant for ecology and evolution.

Fungi can add to our understanding of where we came from and how we might share our ecosystems going forward.

The Conversation

Alexander Piel receives funding from the Salk/UCSD Center for Academic Research and Training in Anthropogeny and the Department of Human Origins, Max Planck Institute for Evolutionary Anthropology. He is an associate researcher with MPI-EVA.

Fiona Stewart receives funding from the Salk/UCSD Center for Academic Research and Training in Anthropogeny and the Department of Human Origins, Max Planck Institute for Evolutionary Anthropology. She is an associate researcher with MPI-EVA.

ref. Mushrooms may have been part of early human diets: primate study explores who eats what and when – https://theconversation.com/mushrooms-may-have-been-part-of-early-human-diets-primate-study-explores-who-eats-what-and-when-264089

Civil society helps uphold democracy and provides built-in resistance to authoritarianism

Source: The Conversation – USA – By Christopher Justin Einolf, Professor of Sociology, Northern Illinois University

Alex Soros is the board chair of the Open Society Foundations, the philanthropy funded by his father, George Soros. AP Photo/Manuel Balce Ceneta

The New York Times reports that a senior Department of Justice official recently “instructed more than a half dozen U.S. attorneys’ offices to draft plans to investigate” the Open Society Foundations – philanthropies funded by the billionaire George Soros.

Citing a document that the news outlet said its reporters had seen, the report listed possible charges the foundations could face “ranging from arson to material support of terrorism.”

The philanthropic institution denied any wrongdoing.

“These accusations are politically motivated attacks on civil society, meant to silence speech the administration disagrees with and undermine the First Amendment right to free speech,” Open Society Foundations stated in response to the reported investigations. “When power is abused to take away the rights of some people, it puts the rights of all people at risk.”

The term “civil society” isn’t familiar to all Americans. But it’s part of what helped this country grow and thrive because it encompasses many of the institutions that uphold the American way of life. As a sociologist who studies nonprofits and civil society in the U.S. and around the world, I have always been interested in the relationship between the health of a nation’s civil society and the strength of rights and freedom within its borders.

I’ve also noticed that often the term is used without a definition. But I think that it’s important for Americans to become more familiar with what civil society is and how it helps sustain democracy in the United States.

Civil society

The Encyclopedia Britannica defines civil society as “the dense network of groups, communities, networks and ties that stand between the individual and the modern state.”

This constellation of institutions consists of not-for-profit organizations and special interest groups, either formal or informal, working to improve the lives of their constituents. It includes charitable groups, clubs and voluntary associations, churches and other houses of worship, labor unions, grassroots associations, community organizations, foundations, museums and other kinds of nonprofits – including nonprofit media outlets.

Civil society does not include government agencies or for-profit businesses.

Political scientists and sociologists have long claimed that a healthy civil society, which in the U.S. includes a strong and independent nonprofit sector, helps sustain democracy. This is true even though most nonprofits don’t engage in partisan political activities.

My own analysis of survey data from 64 countries has shown that authoritarians have begun to use civil society groups to support their own purposes. But in the United States, at least, most civil society organizations still support democratic values.

Sometimes, scholars call civil society “the third sector” to distinguish it from the public and private spheres.

Most scholars agree that civil society strengthens and protects democracy, and that true democracy is impossible without it. These scholars distinguish between liberal democracies and illiberal democracies.

Liberal democracies have a separation of powers – meaning the executive, legislative and judicial branches of government. They protect individual rights, allow a free press, maintain an independent judiciary and safeguard the rights of minorities.

In illiberal democracies, there are periodic elections, but they are not necessarily fair or free. Civil society tends to be more restricted in illiberal democracies than in liberal ones.

An American strength from the start

The strength of America’s civil society helps explain the long success of democracy in the United States.

In 1835, when the French scholar and diplomat Alexis de Tocqueville visited the country, he marveled at the tendency of Americans to “constantly unite.” They created associations, he wrote, “to give fêtes, to found seminaries, to build inns, to raise churches, to distribute books, to send missionaries to the antipodes; in this manner they create hospitals, prisons, schools.”

Whereas the government initiated grand projects in France and the nobility did so in England, in the United States voluntary associations of ordinary individuals were behind most great endeavors.

People in periwinkle blue T-shirts stand while children sit on the ground, surrounded by dogs.
A Lutheran group that provides comfort dogs after traumatic events visits survivors of a school shooting in Minneapolis on Aug. 28, 2025.
AP Photo/Abbie Parr

What happens in nondemocratic countries

One way to see how important a robust civil society can be is to look at what happens in countries that do not have one.

The totalitarian countries of the 20th century, particularly communist China and the Soviet Union, outlawed civil society under the pretense that the party and the state represented the people’s true interests.

When the Berlin Wall fell in 1989, the United States and Western Europe devoted much diplomacy and foreign aid to helping the former USSR and the countries of Eastern Europe develop civil society institutions, believing this to be a precondition of those countries’ transition to democracy.

Today, civil society flourishes in formerly communist nations that have successfully made the transition to democracy, such as the Czech Republic, Estonia, Latvia and Lithuania. Civil society is restricted in that region’s countries that don’t embrace democracy, such as Belarus and Russia.

A man fixes a bicycle.
Volunteer Clayton Streich fixes a bicycle at Lincoln Bike Kitchen, an American nonprofit, in 2024 in Lincoln, Neb.
AP Photo/Rebecca S. Gratz

Not your grandma’s authoritarians

Today’s authoritarian rulers realize that civil society has the potential to support democracy and pry loose their grip on power. But few of those leaders outlaw civil society organizations entirely.

Instead, authoritarian leaders subordinate civil society organizations to achieve their own ends. In China, which had no civil society before the 1990s, the Communist Party now creates government-organized nongovernmental organizations, or GONGOs, which look like nonprofits and are technically separate from the state, but remain under state control.

Some authoritarians who take power in countries that already have a civil society sector tame these organizations and harness their power through a range of oppressive tactics. They leave alone service-providing organizations, like food banks, free clinics and homeless shelters, and use them to show citizens how they are bringing them benefits.

However, they crack down on advocacy organizations, such as human rights groups, labor unions and feminist groups, as these are a source of potential opposition to the regime. They then cultivate pro-regime civil society institutions, providing them with formal and informal support.

When authoritarians crack down on civil society groups, they sometimes destroy offices and imprison the organization’s leaders and members of their staff. But they generally use more subtle means.

For example, they may pass laws restricting the amount of funding, particularly foreign funding, available to nonprofits. They add layers of red tape that make it hard for nonprofits to operate, such as audits, registration requirements and information requests.

Authoritarians may use those hurdles selectively. Nonprofits that are neutral or friendly to the regime may find they can operate freely. Nonprofits the regime perceives as opponents undergo extensive audits, are forced to wait a long time when they seek to incorporate, and face constant demands for personal information about their funders, members and clients.

Man holding a sign with Vladimir Putin's face on it hands out newspapers.
An activist of the pro-Kremlin National Liberation Movement hands out materials while holding a sign that includes a portrait of Russian President Vladimir Putin in Moscow.
Getty Images

Attacks in the United States

Even before news broke of the Trump administration’s reported demand that the Open Society Foundations be investigated, there were mounting signs that the U.S. was becoming more like authoritarian countries than it used to be in terms of how it treats civil society.

In March 2025, for example, President Donald Trump signed an executive order restricting a federal program that forgives student loans for people who work in public service organizations or the government. The order said that employees of institutions that the Trump administration deems to “have a substantial illegal purpose,” such as providing services to undocumented immigrants or serving the needs of transgender clients, would become ineligible for loan forgiveness.

Over the summer, Congress held three investigative hearings on nonprofits. The Republican Party’s leadership signaled its disdain and distrust of those groups with hearing titles like “Public Funds, Private Agendas: NGOs Gone Wild,” “How Leftist Nonprofit Networks Exploit Federal Tax Dollars to Advance a Radical Agenda,” and “An Inside Job: How NGOs Facilitated the Biden Border Crisis.”

After the murder of conservative activist Charlie Kirk, Vice President JD Vance threatened “to go after the NGO network that foments, facilitates and engages in violence,” including the Ford Foundation and the Open Society Foundations, despite the fact that there is no evidence that these organizations support violence.

Some nonprofits have published open letters, issued public statements and provided congressional testimony in opposition to the administration’s claims.

What happens next is unclear. The threat to strip organizations of their nonprofit status may be an empty one, given that the Supreme Court has already ruled that doing so is regulated by law and the president cannot do it on a whim.

Many scholars of nonprofits are watching to see if the United States takes more steps down this road to authoritarianism, stays where it is or reverses course.

We are studying how America’s flourishing civil society resists any restrictions that limit the freedoms that have largely been taken for granted – until now.

The Conversation

Christopher Justin Einolf does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Civil society helps uphold democracy and provides built-in resistance to authoritarianism – https://theconversation.com/civil-society-helps-uphold-democracy-and-provides-built-in-resistance-to-authoritarianism-265705

What parents need to know about Tylenol, autism and the difference between finding a link and finding a cause in scientific research

Source: The Conversation – USA (3) – By Mark Louie Ramos, Assistant Research Professor of Health Policy and Administration, Penn State

In cases where associations are found, researchers must consider dosage response, differences between siblings and other factors to determine a cause-and-effect relationship. Ronaldo Schemidt/AFP via Getty Images

Claims from the Trump White House about links between use of the painkiller acetaminophen – often sold under the brand name Tylenol in the U.S. – during pregnancy and development of autism have set off a deluge of responses across the medical, scientific and public health communities.

As a father of a child with level 2 autism – meaning autism that requires substantial support – and a statistician who works with such tools as those used in the association studies cited by the White House, I find it useful to think about the nuances of association versus causation in observational studies. I hope that this explanation is helpful to parents and expecting parents who, like me, are deeply invested in the well-being of their children.

a bunch of white pills are shown with the words tylenol 500 on them in red
The painkiller acetaminophen is often sold under the brand name Tylenol in the U.S.
AP Photo/Jae C. Hong

Association is not causation, but …

Most people have heard this before, but it bears repeating: Association does not imply causation.

An often-cited example is that there is a very strong association between ice cream sales and incidents of shark attacks. Of course, it goes without saying that shark attacks aren’t caused by ice cream sales. Rather, in the summertime, hot weather drives more appetite for ice cream and beach time. The increased number of people at the beach does, in turn, cause the likelihood of shark attacks to increase.

Yet pointing this out on its own is neither intellectually satisfying nor emotionally appeasing when it comes to real-life medical concerns, since an association does suggest potential for a causal relationship.

In other words, some associations do end up being convincingly causal. In fact, some of the most consequential discoveries of the past century in public health, like the links between smoking and lung cancer or the human papillomavirus (HPV) and cervical cancer, started out as findings of very strong association.

So when it comes to the issue of prenatal acetaminophen use and autism development, it is important to consider how strong the association found is, as well as the extent to which such an association could be considered causal.

Establishing causal association

So how do scientists determine if an observed association is actually causal?

The gold standard for doing so is conducting what are called randomized, controlled experiments. In these studies, participants are randomly assigned to receive treatment or not, and the environment where they are observed is controlled so that the only external element that differs among participants is whether they received treatment or not.

In doing this, researchers reasonably ensure that any difference in the outcomes of the participants can be directly attributed as being caused by whether they received the treatment. That is, any association between treatment and outcome can be considered causal.

Yet oftentimes, conducting such an experiment is impossible, unethical or both. For instance, it would be highly difficult to gather a cohort of pregnant women for an experiment and extremely unethical to randomly assign half of them to take acetaminophen, or any other medication for no particular reason, and the other half not to.

So when experiments are simply infeasible, an alternative is to make some reasonable assumptions on how observational data would behave if the association was causal and then see if the data aligns with these causal assumptions. This can very broadly be referred to as observational causal inference.

Parsing what the studies mean

So how does this apply to the current controversy over the potential for acetaminophen use during pregnancy to affect the fetus in a way that could result in a condition like autism?

Researchers who try to understand causal roles and links between one variable and potential health outcomes do so by considering: 1) the size and consistency of the association across multiple attempts to estimate it, and 2) the extent to which such association has been established under observational causal inference frameworks.

Researchers have been working to measure possible associations between acetaminophen use during pregnancy and autism since as early as 1987. A number of these studies, including multiple large systematic reviews, have found evidence of such associations.

For instance, a 2025 review of 46 studies examining the association between acetaminophen use and an array of neurodevelopmental disorders, including autism, identified five positive associations between acetaminophen and autism among the included papers.

In one of those studies, which examined 73,881 births, the researchers found that children who were exposed to acetaminophen prenatally were 20% more likely to develop borderline or clinical autism spectrum conditions. Another examined 2.48 million births and reported an estimated association of only 5%.

Both of those are weak associations. For context, estimates of the increased lung cancer risk from smoking in the 1950s were between 900% and 1,900%. That is, a smoker is 10 to 20 times more likely than a nonsmoker to develop lung cancer. By comparison, in the two autism studies above, a pregnant woman who takes acetaminophen is 1.05 to 1.20 times more likely than one who does not take the drug to have a child who is later diagnosed with autism.
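To put those relative risks in absolute terms, here is a back-of-the-envelope sketch. The 2% baseline prevalence is a round number assumed purely for illustration; it is not a figure from the studies.

```python
baseline = 0.02   # hypothetical baseline prevalence, assumed only for illustration

for study, relative_risk in [("73,881-birth study", 1.20),
                             ("2.48-million-birth study", 1.05)]:
    risk_if_exposed = baseline * relative_risk
    extra_per_1000 = (risk_if_exposed - baseline) * 1000
    print(f"{study}: {baseline:.1%} -> {risk_if_exposed:.2%} "
          f"(about {extra_per_1000:.0f} extra diagnosis(es) per 1,000 births)")
```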

It’s also important to keep in mind that many factors can affect how well a study is able to estimate an association. In general, larger sample sizes provide both greater power to detect an association if one does exist, as well as improved precision over estimating the value of the association. This does not mean that studies with smaller sample sizes are not valid, only that from a statistical perspective, researchers like me place greater confidence in an association drawn from a larger sample size.
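The sketch below illustrates that point with simulated studies of different sizes: as the sample grows, estimates of the same underlying rate cluster more tightly around the truth. The 2% rate is again just an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
true_rate = 0.02    # illustrative underlying rate

for n in (1_000, 10_000, 100_000):
    # Simulate 2,000 studies of size n and look at the spread of their estimates.
    estimates = rng.binomial(n, true_rate, size=2_000) / n
    low, high = np.percentile(estimates, [2.5, 97.5])
    print(f"n = {n:>7,}: 95% of estimates fall between {low:.2%} and {high:.2%}")
```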

Once an association – even a small one – is established, researchers must then consider the extent to which causation can be claimed. One way to do this is through what’s called dose-response: looking at whether the association is stronger among women who took higher doses of acetaminophen during pregnancy.

The study mentioned above that looked at 2.48 million births offers an example of dose-response: it found that children of women who reported taking higher doses had a higher risk of autism.
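In code, a dose-response check boils down to asking whether diagnosis rates rise steadily across exposure categories. The toy data below are made up solely to show the pattern being looked for; the column names and numbers are not from the registry study.

```python
import pandas as pd

births = pd.DataFrame({
    "dose_category": ["none"] * 5000 + ["low"] * 3000 + ["high"] * 2000,
    "autism_dx":     [1] * 100 + [0] * 4900     # 2.0% among the unexposed
                   + [1] * 66  + [0] * 2934     # 2.2% at low reported doses
                   + [1] * 48  + [0] * 1952,    # 2.4% at high reported doses
})

# A causal story predicts rates that increase monotonically with dose.
rates = births.groupby("dose_category")["autism_dx"].mean()
print(rates.reindex(["none", "low", "high"]))
```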

Another way to examine possible causality in this context is to analyze sibling outcomes, which that same paper did. Researchers looked at whether associations between acetaminophen and autism persisted within families with more than one child.

For example, in a family with two children, if the mother used acetaminophen during one pregnancy and that child was later diagnosed with autism, but she did not use it during the other pregnancy and that child was not diagnosed, then this strengthens the causal claim. Conversely, if acetaminophen was used during the pregnancy of the child who was not diagnosed with autism and not used during the pregnancy of the child who was, then that weakens the causal claim. When this was included in the analysis, the dose-response disappeared, and in fact the overall 5% increased risk mentioned before likewise disappeared. This weakens the claim of a causal relationship.
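Here is a rough sketch of the sibling-comparison logic: keep only families in which one pregnancy was exposed and another was not, then compare the siblings’ outcomes. The tiny table below is hypothetical and only illustrates the mechanics of the design.

```python
import pandas as pd

kids = pd.DataFrame({
    "family_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "exposed":   [1, 0, 1, 0, 0, 1, 1, 0],   # prenatal acetaminophen use
    "autism_dx": [0, 0, 1, 0, 0, 0, 0, 1],   # later diagnosis
})

# Keep only families that are discordant on exposure.
discordant = kids.groupby("family_id").filter(lambda fam: fam["exposed"].nunique() == 2)

# Compare diagnosis rates between exposed and unexposed siblings.
print(discordant.groupby("exposed")["autism_dx"].mean())
# If this within-family gap shrinks toward zero, shared family factors --
# rather than the drug -- likely explain the population-level association.
```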

Consult your doctor

At present, there is clearly not enough evidence to establish a causal association between prenatal acetaminophen use and autism.

Yet as a parent who wonders if my daughter will ever be able to write her name, or hold a job or raise kids of her own, I understand that such explanations may not appease the fears or concerns of an expecting mother who is suffering from a fever.

Naturally, all of us want absolute certainty.

But that’s not possible when it comes to acetaminophen use, at least not at this time.

Your doctor will be able to provide you with much sounder advice than any existing study on this topic. Your OB-GYN is very likely aware of these studies and has much better judgment as to how the results should be weighed in the context of your personal medical history and needs.

Researchers, meanwhile, will continue to dig deeper into the science of this critically important issue and, hopefully, provide greater clarity in the years to come.

The Conversation

Mark Louie Ramos does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What parents need to know about Tylenol, autism and the difference between finding a link and finding a cause in scientific research – https://theconversation.com/what-parents-need-to-know-about-tylenol-autism-and-the-difference-between-finding-a-link-and-finding-a-cause-in-scientific-research-265946

Even a brief government shutdown might hamper morale, raise costs and reduce long-term efficiency in the federal workforce

Source: The Conversation – USA – By Gonzalo Maturana, Associate Professor of Finance, Emory University

A sign indicates the closing of federal services during the government shutdown in 2013. AP Photo/Susan Walsh

As the federal fiscal year draws to a close, an increasingly familiar prospect is drawing near in Washington, D.C.: a possible government shutdown. And for federal workers, it couldn’t come at a worse time.

In the fractious and polarized political landscape of the United States, Democrats and Republicans have come to rely on short-term, stopgap funding bills to keep the government operating in the absence of elusive longer-term budget deals.

With the parties currently far apart over the terms of even a short-term budget resolution, the government is set to shut down on Oct. 1, 2025, barring an 11th-hour deal that appears far off. If the shutdown does happen, it would mark another difficult moment for a federal workforce that has already shed more than 300,000 jobs this year, largely due to ongoing Trump administration efforts to downsize parts of the federal government and restructure or largely eliminate certain agencies with the stated aim of increasing efficiency.

With a government shutdown, hundreds of thousands of federal employees would be furloughed – sent home without pay until funding resumes.

As a team of financial economists who study labor markets and public sector employment and have examined millions of federal personnel records spanning such government shutdowns in the past, we have found that the consequences reach far beyond the now-familiar images of closed national parks and stalled federal services. Indeed, based on our study of an October 2013 shutdown during which about 800,000 federal employees were furloughed for 16 days, shutdowns leave an enduring negative effect on the federal workforce, reshaping its composition and weakening its performance for years to come.

What happens to workers

Millions of Americans interact with the federal government every day in ways both big and small. More than one-third of U.S. national spending is routed through government programs, including Medicare and Social Security. Federal workers manage national parks, draft environmental regulations and help keep air travel safe.

Whatever one’s political leanings, if the goal is a government that handles these responsibilities effectively, then attracting and retaining a talented workforce is essential.

Yet it may be increasingly difficult for the federal government to do so, in part because prolonged shutdowns can have hidden effects.

When Congress fails to pass appropriations, federal agencies must furlough employees whose jobs are not deemed “excepted” – commonly referred to as essential. Those excepted employees keep working, while others are barred from working or even volunteering until funding resumes. Furlough status reflects funding sources and mission categories, not an individual’s performance, so it carries no signal about an employee’s future prospects and primarily acts as a shock to morale.

Importantly, furloughs do not create long-term wealth losses; back pay has always been granted and, since 2019, is legally guaranteed. Employees therefore recover their pay even though they may face real financial strain in the short run.

A cynical observer might call furloughs a paid vacation, yet the data tells a different story.

An American flag is seen inside the U.S. Capitol Building on Sept. 23, 2025, ahead of a looming government shutdown.
Photo by Anna Moneymaker/Getty Images

Immediate consequences, longer-term effects

Using extensive administrative records on federal civilian workers from the October 2013 shutdown, we tracked how this shock to morale rippled through government operations. Employees exposed to furloughs were 31% more likely to leave their jobs within one year.

These departures were not quickly replaced, forcing agencies to rely on costly temporary workers and leading to measurable declines in core functions such as payment accuracy, legal enforcement and patenting activity.

Further, we found that this exodus builds over the first two years after the shutdown and then settles into a permanently lower headcount, implying a durable loss of human capital. The shock to morale is more pronounced among young, female and highly educated professionals with plenty of outside options. Indeed, our analysis of survey data from a later 2018-2019 shutdown confirms that morale, not income loss, drives the exits.

Employees who felt most affected reported a sharp drop in agency, control and recognition, and they were far more likely to plan a departure.

The effect of the motivation loss is striking. Using a simple economic model in which workers value both cash and purpose, we estimate that the drop in intrinsic motivation after a shutdown would require a roughly 10% wage raise to offset.
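One stylized way to read that figure – my own illustration, not necessarily the authors’ specification – is as a compensating differential: if utility is the log of the wage plus a term for intrinsic motivation, a given motivation loss maps directly to the wage raise needed to leave the worker indifferent.

```python
import math

def compensating_raise(motivation_drop: float) -> float:
    """Fractional wage raise that restores utility when
    utility = log(wage) + motivation and motivation falls by `motivation_drop`."""
    # log(w * (1 + r)) + (m - drop) = log(w) + m  =>  r = e^drop - 1
    return math.exp(motivation_drop) - 1

# A motivation loss of about 0.095 log-wage units corresponds to roughly a 10% raise.
print(f"{compensating_raise(0.095):.1%}")
```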

Policy implications

Some people have argued that this outflow of employees amounts to a necessary trimming – a way to shrink government by “starving the beast.”

But the evidence paints a different picture. Agencies hit hardest by furloughs turned to temporary staffing firms to fill the gaps. Over the two years after the shutdown we analyzed, these agencies spent about US$1 billion more on contractors than they saved in payroll.

The costs go beyond replacement spending, as government performance also suffers. Agencies that were more affected by the shutdown recorded higher rates of inaccurate federal payments for several years. Even after partial recovery, losses amounted to hundreds of millions of dollars that taxpayers never recouped.

Other skill-intensive functions declined as well. Legal enforcement fell in agencies that became short of experienced attorneys, and patenting activity dropped in science and engineering agencies after key inventors left.

Official estimates of shutdown costs typically focus on near-term GDP effects and back pay. But our findings show that an even bigger bill comes later in the form of higher employee turnover, higher labor costs to fill gaps, and measurable losses in productivity.

Shutdowns are blunt, recurring shocks that demoralize the public workforce and erode performance. These costs spill over to everyone who relies on government services. If the public wants efficient, accountable public institutions, then we should all care about avoiding shutdowns.

After an already turbulent year, it is unclear whether an upcoming shutdown would significantly add to the strain on federal employees or have a more limited effect, since many who were considering leaving have already left through buyouts or forced terminations this year. What is clear is that hundreds of thousands of federal employees are likely to experience another period of uncertainty.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Even a brief government shutdown might hamper morale, raise costs and reduce long-term efficiency in the federal workforce – https://theconversation.com/even-a-brief-government-shutdown-might-hamper-morale-raise-costs-and-reduce-long-term-efficiency-in-the-federal-workforce-265723

Nicolas Sarkozy sentenced to five years in prison: a democratic normalisation?

Source: The Conversation – France in French (3) – By Vincent Sizaire, Maître de conférence associé, membre du centre de droit pénal et de criminologie, Université Paris Nanterre – Université Paris Lumières

Former President of the Republic Nicolas Sarkozy has been found guilty of criminal conspiracy in the case of the Libyan financing of his victorious 2007 presidential campaign. Sentenced, among other penalties, to five years in prison, he has been summoned for 13 October to learn the date of his incarceration. This event, unprecedented in French history, is part of an evolution in the practices of the judiciary, which has gradually emancipated itself from political power. It crowns the republican principle, proclaimed in 1789 but long left theoretical, of full and complete equality of citizens before the law.


On 25 September 2025, Nicolas Sarkozy was found guilty of criminal conspiracy by the Paris criminal court, which held that he had played an active role in setting up a scheme to finance his 2007 election campaign with money from the Libyan leadership. As might have been expected, the decision immediately drew the ire of a large part of the political class.

Contesting the decision on the grounds that it is unjust and unfounded is perfectly legitimate in a democratic society, starting with the defendants themselves, whose strictest right it is – as is, for that matter, appealing the judgment. But, in the wake of the ruling in the case of the National Front’s parliamentary assistants, this conviction is also an opportunity for a large fraction of the ruling classes to revive the trial of the supposed “government of judges”.

Admittedly, the sentence may seem particularly severe: a €100,000 fine, five years of ineligibility and, above all, five years’ imprisonment with a deferred committal order which, combined with provisional enforcement, obliges the convicted person to begin serving his prison sentence even if he appeals.

However, set against the facts for which the former head of state was convicted, these penalties do not appear disproportionate. The facts are undeniably serious: organising the covert financing of an election campaign with funds from a corrupt and authoritarian regime, Libya (whose responsibility for an attack on an aircraft that killed more than 50 French nationals has been recognised by the courts), in return for intervening to help its return to the international stage.

Given that the maximum sentence was ten years in prison, the penalty ultimately handed down can hardly be regarded as manifestly excessive. But what is being contested is the very principle of a political leader being convicted by the courts, experienced and presented as an intolerable attack on the institutional balance.

Taking the time to put this into historical perspective, however, shows that the judgments handed down in recent years against members of the ruling class are in fact part of a movement of relative emancipation of judicial power from the other branches, and in particular from the executive. An emancipation that finally allows it to apply the requirements of the republican legal order in full.

The equality of citizens before the law, a republican principle

It bears recalling that the revolutionary principle proclaimed on the night of 4–5 August 1789 is that of full and complete equality before the law, entailing the corresponding abolition of all particular laws – “privileges” in the legal sense of the term – enjoyed by the nobility and the upper clergy. The Penal Code of 1791 went even further: not only could those who govern have their responsibility examined before the same courts as other citizens, but they also faced aggravated penalties for certain offences, notably breaches of probity.

The principles on which the republican legal system is built could not be clearer: in a democratic society, where every person is entitled to demand not only the full enjoyment of their rights but, more generally, the application of the law, no one can claim the benefit of a special regime – elected officials least of all. It is because we have the assurance that their illegal acts will actually be punished, in the same way as those of other citizens and without waiting for a highly hypothetical electoral sanction, that they can truly call themselves our representatives.

For a long time, however, this requirement of legal equality remained largely theoretical. Brought to heel and placed in a more or less explicit relationship of subordination to the government under the First Empire (1804–1814), the judiciary remained under the influence of the executive until at least the middle of the 20th century. That is why, until the end of the last century, the principle of equality before the law ran up against a singular privilege of “notability” which, except in exceptional situations or particularly serious and highly publicised cases, guaranteed relative impunity to members of the ruling classes whose criminal responsibility was called into question. It is worth bearing in mind that the figure of the “red judge”, popularised in the media at the end of the 1970s, served to stigmatise magistrates solely because they had remanded business leaders or notaries in custody, just like common highway robbers.

Things only began to change with the great humanist surge of the Liberation, which led, among other things, to the creation of a corps of magistrates recruited by competitive examination, enjoying from 1958 a relatively protective status and a dedicated professional training school, the École nationale de la magistrature. This corps gradually developed a demanding professional ethic, fostered in particular by the recognition of judicial trade unionism in 1972. Thus emerged a new generation of judges who took seriously the mission entrusted to them: to ensure, in full independence, the proper application of the law, whatever the status or social position of the persons involved.

It is in this context that what had still been unthinkable a few decades earlier came about: the prosecution and conviction of notables on the same footing as the rest of the population. Begun, as noted, in the mid-1970s, the movement gathered pace in the following decades with the conviction of major business leaders, such as Bernard Tapie, and then of national political figures, such as Alain Carignon and Michel Noir, members of parliament and mayors of Grenoble and Lyon. The conviction of former presidents of the Republic from the 2010s onwards – Jacques Chirac in 2011, Nicolas Sarkozy for the first time in 2021 – completed the normalisation of this trend or, rather, put an end to the democratic anomaly of reserving preferential treatment for elected officials and, more broadly, for the ruling classes.

Stemming first of all from a change in judicial practice, this movement has also been able to draw on certain changes to the legal framework. One example is the constitutional revision of February 2007, which enshrined the Constitutional Council’s case law according to which the president of the Republic cannot be subject to any criminal prosecution while in office, but which allows proceedings to resume as soon as his term ends. One can also mention the creation, in December 2013, of the Parquet national financier (the national financial prosecutor’s office) which, although it does not enjoy statutory independence from the executive, has demonstrated its de facto independence in recent years.

It is precisely against this historical development that the rhetoric of the “tyranny of judges” is being mobilised today. A rhetoric that aims less to defend the sovereignty of the people than the oligarchic sovereignty of those who govern.

The Conversation

Vincent Sizaire does not work for, consult, own shares in or receive funding from any organisation that would benefit from this article, and has disclosed no affiliations other than his research institution.

ref. Nicolas Sarkozy condamné à 5 ans de prison : une normalisation démocratique ? – https://theconversation.com/nicolas-sarkozy-condamne-a-5-ans-de-prison-une-normalisation-democratique-266101

The robotic mirror: are we humans really good enough to want copies?

Source: The Conversation – (in Spanish) – By Nagore Osa Arzuaga, Docente e investigadora en Innovación en Diseño Industrial, con especialización en Diseño de Interacción Humano-Robot y Factores Humanos, Mondragon Unibertsitatea

Human likeness in robots draws attention, but it also reveals the blind spot of technological anthropocentrism. Andy Kelly / Unsplash, CC BY

Robots make headlines when they imitate us: they converse in near-human voices, write texts that look like ours or “read” emotions on a screen. But the leap that really matters is a different one: to stop copying us and start complementing us, designing capabilities that patch our weak spots – attention, biases, fatigue – and valuing machines for their impact on people, not for how human they seem.

Cognitive robotics: digital humans?

Perhaps, if someone mentions cognitive robotics, we don’t know what it is. But we have surely read some news story about Neuralink, Elon Musk’s company that seeks to connect brains and computers; or we have seen androids dancing clumsily, or we know Amazon’s robots, which roam warehouses loaded with packages.

Cognitive robotics seeks to give machines something more than strength and precision: abilities similar to those of humans or animals. It is not just about driving motors and sensors, but about machines being able to perceive, remember, learn, anticipate and adapt when things change. Its goal is to move from the “robot that repeats” to one that understands context. To do so, it draws on disciplines such as artificial intelligence, cognitive science and biology.

Drones and brain-controlled prostheses

Within this field, there are very different strategies. For example, swarm robotics, inspired by ants or bees, studies how simple robots achieve together what a single one could not: from drones that coordinate in rescue operations to machines that divide up tasks in a warehouse.

By contrast, approaches such as neurorobotics or developmental robotics seek to imitate human cognition, modelling the brain or copying the learning mechanisms of children.

What began as an academic challenge is already leaving the laboratory. We see social robots in classrooms and hospitals, swarms in the logistics industry, and prostheses controlled by brain signals. They are advancing, but with limits: they still depend on controlled environments, lack “common sense” and do not generalise well from what they have learned.

In our research, however, we chose to look in another direction. Beyond solving each technical obstacle, we ask: does it really benefit us for robots to resemble us so closely?

The blind spot of anthropocentrism

For decades, we have measured progress in robotics with one very specific yardstick: does it resemble us? We speak of “human-level” vision, “almost human” reasoning or hands “as dexterous as ours”. But resemblance does not guarantee impact in real contexts of work or care. And therein lies the trap: if we aspire to copy the human, we also copy its limits.

Consider critical tasks that require constant vigilance: humans fail in predictable ways. Sustained attention collapses over time and performance drops. We cannot forget that humans get tired.

Add to this our cognitive biases. We tend to confirm prior hypotheses, to trust too much when a tool is often right, or to ignore contradictory signals. The result is well known: avoidable errors and unreliable decisions.

A recent study that systematically analyses the fields of cognitive robotics and human-robot collaboration points in the same direction: topics in cognition (learning, prediction, intention) and in collaboration (task, control, safety, trust) are growing, but along separate paths. The key findings indicate that, while advances in cognitive robotics have produced more autonomous and adaptable systems, integrating them effectively into collaborative practices with people remains a challenge.

If the standard of success for a cognitive robot is “resembling us”, it will end up inheriting exactly the same weaknesses. And when it comes to collaboration with people, that hardly seems the most desirable scenario.

The image of the humanoid robot is anchored in the collective imagination, but in practice it may not be such a necessary solution.
Taiki Ishikawa / Unsplash, CC BY

From inheriting flaws to fixing them

If copying the human makes the robot inherit our weaknesses, let’s change the question: what can a robot contribute precisely because it is not human? Let’s start from our real limitations – attention lapses, biases, fatigue, physical load – and equip robots with capabilities that compensate for them.

Other authors have responded to this same concern with proposals such as that of “robotic superpowers”, developed by Robin Neuhaus, an expert in human-robot interaction design.

There are three families of “superpowers” that are easy to visualise. The physical ones mean that robots feel neither pain nor fatigue and maintain constant precision. They are useful, for example, for sustaining long, delicate tasks without losing steadiness or rhythm.

The cognitive ones, in turn, give them zero need to compete, infinite patience and sustained focus. They can repeat an instruction a hundred times without getting annoyed, ask for confirmation when something is ambiguous and take nothing personally.

Finally, there are the communicative superpowers: robots express themselves without ambiguity or double meanings, and they neither discriminate nor take offence. They translate complex internal states into simple, timely messages (what is happening, why it matters and what the next step is), which eliminates misunderstandings.

Diagram of the robotic “superpowers”, the capabilities with which robots could cover our weaknesses.
Nagore Osa Arzuaga.

From mirror to ally

This shift is not so much technological as methodological. The technical pieces already exist. What is missing is a change in the starting point: designing from people, not from the machine’s potential. Only then will we be able to speak of robots that are truly human-centred: defined by our needs, co-designed with those who use them and evaluated by their impact on work, safety and well-being.

Building more mirrors only passes on our flaws. If we want robots that truly help us, let’s change the script: a useful superpower rather than a brilliant “almost human”.

The Conversation

The authors are not employees or consultants of, do not own shares in and do not receive funding from any company or organisation that could benefit from this article, and have declared no relevant ties beyond the academic position cited above.

ref. El espejo robótico: ¿tan buenos somos los humanos como para querer copias? – https://theconversation.com/el-espejo-robotico-tan-buenos-somos-los-humanos-como-para-querer-copias-265121