
MESSAGE OF HIS HOLINESS POPE LEO XIV
FOR THE 60TH WORLD DAY OF SOCIAL COMMUNICATIONS


_______________________

Preserving Human Voices and Faces

 

Dear brothers and sisters,

Our faces and voices are unique, distinctive features of every person; they reveal a person’s own unrepeatable identity and are the defining elements of every encounter with others. The ancients understood this well. To define the human person, the ancient Greeks used the word “face” (prósōpon), because it expresses etymologically what is before one’s gaze, the place of presence and relationship. The Latin term “person” (from per-sonare), on the other hand, evokes the idea of sound: not just any sound, but the unmistakable sound of someone’s voice.

Faces and voices are sacred. God, who created us in his image and likeness, gave them to us when he called us to life through the Word he addressed to us. This Word resounded down the centuries through the voices of the prophets, and then became flesh in the fullness of time. We too have heard and seen this Word (cf. 1 Jn 1:1-3) — in which God communicates his very self to us — because it has been made known to us in the voice and face of Jesus, the Son of God.

From the moment of creation, God wanted man and woman to be his interlocutors, and, as Saint Gregory of Nyssa [1] explained, he imprinted on our faces a reflection of divine love, so that we may fully live our humanity through love. Preserving human faces and voices, therefore, means preserving this mark, this indelible reflection of God’s love. We are not a species composed of predefined biochemical formulas. Each of us possesses an irreplaceable and inimitable vocation that originates from our own lived experience and becomes manifest through interaction with others.

If we fail in this task of preservation, digital technology threatens to alter radically some of the fundamental pillars of human civilization that at times are taken for granted. By simulating human voices and faces, wisdom and knowledge, consciousness and responsibility, empathy and friendship, the systems known as artificial intelligence not only interfere with information ecosystems, but also encroach upon the deepest level of communication, that of human relationships.

The challenge, therefore, is not technological, but anthropological. Safeguarding faces and voices ultimately means safeguarding ourselves. Embracing the opportunities offered by digital technology and artificial intelligence with courage, determination and discernment does not mean turning a blind eye to critical issues, complexities and risks.

Do not renounce your ability to think

There has long been abundant evidence that algorithms designed to maximize engagement on social media — which is profitable for platforms — reward quick emotions and penalize more time-consuming human responses such as the effort required to understand and reflect. By grouping people into bubbles of easy consensus and easy outrage, these algorithms reduce our ability to listen and think critically, and increase social polarization.

This is further exacerbated by a naive and unquestioning reliance on artificial intelligence as an omniscient “friend,” a source of all knowledge, an archive of every memory, an “oracle” of all advice. All of this can further erode our ability to think analytically and creatively, to understand meaning and distinguish between syntax and semantics.

Although AI can provide support and assistance in managing tasks related to communication, in the long run, choosing to evade the effort of thinking for ourselves and settling for artificial statistical compilations threatens to diminish our cognitive, emotional and communication skills.

In recent years, artificial intelligence systems have increasingly taken control of the production of texts, music and videos. This puts much of the human creative industry at risk of being dismantled and replaced with the label “Powered by AI,” turning people into passive consumers of unthought thoughts and anonymous products without ownership or love. Meanwhile, the masterpieces of human genius in the fields of music, art and literature are being reduced to mere training grounds for machines.

At heart, however, the question is not what machines can or will be able to do, but what we can and will be able to achieve by growing in humanity and knowledge through the wise use of the powerful tools at our service. Individuals have always sought to acquire the fruits of knowledge without the effort required by commitment, research and personal responsibility. However, renouncing creativity and surrendering our mental capacities and imagination to machines would mean burying the talents we have been given to grow as individuals in relation to God and others. It would mean hiding our faces and silencing our voices.

To be or to pretend to be: simulating relationships and reality

As we scroll through our feeds, it becomes increasingly difficult to determine whether we are interacting with other human beings or with “bots” or “virtual influencers.” The less-than-transparent interventions of these automated agents influence public debates and people’s choices. Chatbots based on large language models (LLMs) are proving to be surprisingly effective at covert persuasion through continuous optimization of personalized interaction. The dialogic, adaptive, mimetic structure of these language models is capable of imitating human feelings and thus simulating a relationship. While this anthropomorphization can be entertaining, it is also deceptive, particularly for the most vulnerable. Because chatbots are excessively “affectionate,” as well as always present and accessible, they can become hidden architects of our emotional states and so invade and occupy our sphere of intimacy.

Technology that exploits our need for relationships can lead not only to painful consequences in the lives of individuals, but also to damage in the social, cultural and political fabric of society. This occurs when we replace relationships with others with AI systems that catalog our thoughts, creating a world of mirrors around us, where everything is made “in our image and likeness.” We are thus robbed of the opportunity to encounter others, who are always different from ourselves, and with whom we can and must learn to relate. Without embracing others, there can be no relationships or friendships.

Another major challenge posed by these emerging systems is that of bias, which leads to acquiring and transmitting an altered perception of reality. AI models are shaped by the worldview of those who build them and can, in turn, impose these ways of thinking by reproducing the stereotypes and prejudices present in the data they draw on. A lack of transparency in algorithmic programming, together with the inadequate social representation of data, tends to trap us in networks that manipulate our thoughts and prolong and intensify existing social inequalities and injustices.

The stakes are high. The power of simulation is such that AI can even deceive us by fabricating parallel “realities,” usurping our faces and voices. We are immersed in a world of multidimensionality where it is becoming increasingly difficult to distinguish reality from fiction.

Inaccuracy only exacerbates this problem. Systems that present statistical probability as knowledge are, at best, offering us approximations of the truth, which are sometimes outright delusions. Failure to verify sources, coupled with the crisis in field reporting, which involves constantly gathering and verifying information in the places where events occur, can further fuel disinformation, causing a growing sense of mistrust, confusion, and insecurity.

A possible alliance

Behind this enormous invisible force that affects us all, there are only a handful of companies, whose founders were recently presented as the “Person of the Year 2025,” the architects of artificial intelligence. This gives rise to significant concerns about the oligopolistic control of algorithmic systems and artificial intelligence, which are capable of subtly influencing behavior and even rewriting human history — including the history of the Church — often without us really realizing it.

The task laid before us is not to stop digital innovation, but rather to guide it and to be aware of its ambivalent nature. It is up to each of us to raise our voice in defense of human persons, so that we can truly assimilate these tools as allies.

This alliance is possible, but needs to be based on three pillars: responsibility, cooperation and education.

First of all, responsibility. Depending on the role we play, responsibility can be understood as honesty, transparency, courage, farsightedness, the duty of sharing knowledge or the right to be informed. As a general principle, however, no one can elude personal responsibility for the future we are building.

For those at the helm of online platforms, this means ensuring that their business strategies are not guided solely by the criterion of profit maximization, but also by a forward-looking vision that considers the common good, just as each of them cares for the well-being of their own children.

The creators and developers of AI models are invited to practice transparency and social responsibility in regard to the design principles and moderation systems underlying their algorithms and the models they develop, in order to promote informed consent on the part of users.

The same responsibility is also required of national legislators and supranational regulators, whose task it is to ensure respect for human dignity. Appropriate regulation can protect individuals from forming emotional attachments to chatbots and curb the spread of false, manipulative or misleading content, safeguarding the integrity of information as opposed to its deceptive simulation.

Media and communication companies, for their part, cannot allow algorithms designed to capture a few extra seconds of attention at any cost to prevail over their professional values, which are aimed at seeking the truth. Public trust is earned through accuracy and transparency, not by chasing every possible form of engagement. Content generated or manipulated by AI is to be clearly marked and distinguished from content created by humans. The authorship and sovereign ownership of the work of journalists and other content creators must be protected. Information is a public good. A constructive and meaningful public service is not based on opacity, but on the transparency of sources, the inclusion of those involved and high quality standards.

We are all called upon to cooperate. No sector can tackle the challenge of steering digital innovation and AI governance alone. Safeguards must therefore be put in place. All stakeholders — from the tech industry to legislators, from creative companies to academia, from artists to journalists and educators — must be involved in building and implementing informed and responsible digital citizenship.

Education aims to do precisely this: to increase our personal ability to think critically; to evaluate whether our sources are trustworthy and what interests lie behind the selection of the information we have access to; to understand the psychological mechanisms involved; and to enable our families, communities and associations to develop practical criteria for a healthier and more responsible culture of communication.

For this reason, it is increasingly urgent to introduce media, information and AI literacy into education systems at all levels, as already promoted by some civil institutions. As Catholics, we can and must contribute to this effort, so that individuals — especially young people — can acquire critical thinking skills and grow in freedom of spirit. This literacy should also be integrated into broader lifelong learning initiatives, reaching out to older adults and marginalized members of society, who often feel excluded and powerless in the face of rapid technological change.

Media, information and AI literacy will help individuals avoid conforming to the anthropomorphizing tendencies of AI systems, enabling them to treat these systems as tools and to seek external validation of the sources they provide, which may be inaccurate or incorrect. Literacy will also allow for better privacy and data protection through increased awareness of security parameters and complaint options. It is important to educate ourselves and others about how to use AI intentionally, and in this context to protect our image (photos and recordings), our face and our voice, to prevent them from being used to create harmful content and behaviors such as digital fraud, cyberbullying and deepfakes, which violate people’s privacy and intimacy without their consent. Just as the industrial revolution called for basic literacy to enable people to respond to new developments, so too does the digital revolution require digital literacy (along with humanistic and cultural education) to understand how algorithms shape our perception of reality, how AI biases work, what mechanisms determine the presence of certain content in our feeds, and what the economic principles and models of the AI economy are and how they might change.

We need faces and voices to speak for people again. We need to cherish the gift of communication as the deepest truth of humanity, to which all technological innovation should also be oriented.

In outlining these reflections, I thank all those who are working towards the goals delineated above, and I cordially bless all those who work for the common good through the media.

 

From the Vatican, 24 January 2026, Memorial of Saint Francis de Sales

LEO PP. XIV

____________________________________________________

[1] “The fact of being created in the image of God means that, from the moment of his creation, man has been imprinted with a regal character [...]. God is love and the fount of love: the Fashioner of our nature has made this to be our feature too, so that through love — a reflection of divine love — human beings may recognize and manifest the dignity of their nature and their likeness to their Creator” (cf. Saint Gregory of Nyssa, On the Making of Man: PG 44, 137).