1. The XXVI General Assembly of the Pontifical Academy for Life in February 2020
In its Instruction on Christian Freedom and Liberation, the Catholic Church’s Congregation for the Doctrine of Faith (CDF) sheds light on the risks of pairing technological prowess with mastery over the natural world:
As technology gains an ever-greater control of nature, it threatens to destroy the very foundations of our future in such a way that mankind living today becomes the enemy of the generations to come. By using blind power to subjugate the forces of nature, are we not on the way to destroying the freedom of the men and women of tomorrow? What forces can protect man from the slavery of his own domination? A wholly new capacity for freedom and liberation, demanding an entirely renewed process of liberation, becomes necessary. The liberating force of scientific knowledge is objectively expressed in the great achievements of technology. Whoever possesses technology has power over the earth and men. As a result of this, hitherto unknown forms of inequality have arisen between those who possess knowledge and those who are simple users of technology. The new technological power is linked to economic power and leads to a concentration of it. Thus, within nations and between nations, relationships of dependence have grown up which within the last twenty years have been the occasion for a new claim to liberation. How can the power of technology be prevented from becoming a power of oppression over human groups or entire peoples?
The Church here warns against the dangers of rendering dominion as all-out lordship over the material world (ourselves included), cautioning that this kind of mindset will ultimately result in humans falling victim to their own hubris and domination. In addition, the CDF goes on to identify the threat of an individualism that leads to the unjust distribution of (limited) resources, as well as of new forms of oppression, slavery, and inequality; the application of technical expertise to acts of genocide is mentioned explicitly in this Instruction (Congregation for the Doctrine of Faith 1986, n. 11–19).
When it comes to artificial intelligence, the Church recognizes that it—like so many other technologies—carries both immense promise and potential peril; there is, at the very least, concern that AI could have a harmful impact on human–human and human–nature relationships. Pope Francis praised the Pontifical Academy for Life for having selected “The ‘Good’ Algorithm? Artificial Intelligence, Ethics, Law, and Health” as the major theme of its XXVI General Assembly, noting that “it is not enough simply to trust in the moral sense of researchers and developers of devices and algorithms. There is a need to create intermediate social bodies that can incorporate and express the ethical sensibilities of users and educators” (Francis 2020, para. 8). The Church sees itself as an important discussion partner here.
As a corresponding member of the Pontifical Academy for Life, I attended this General Assembly in Vatican City at the end of February 2020. Permit me to quickly highlight a number of important points raised over the course of the three-day event.
Jen Copestake—technology reporter and senior producer for BBC Click—showed an interesting videoclip called “Can AI Beat a Doctor?”, which raises a rather perennial question in the debate over AI: should we (or why should we) give preference to human healthcare professionals (who are limited and fallible in judgement) if their functions (especially vis-à-vis diagnostics) can be better performed by AI software (Copestake 2018)? The clip explores the increasing role of AI in helping render healthcare universally accessible and affordable as healthcare resources dwindle despite the rising needs of longer-living populations. The focus here is on the diagnostic capability of digital healthcare providers. Copestake peers into the use of Babyl in Rwanda, the country’s largest digital health service provider, boasting more than two million registered users.
Its parent company—Babylon—made the (highly contested) claim, in 2018, that its AI had the ability to diagnose certain issues as well as (or better than) human healthcare providers and that its software could pass a medical exam with a higher average grade than a human person (the CEO—Ali Parsa—explains that “once a machine learns something, it never forgets”) (Copestake 2018). The company describes its use of AI as follows: “A committed team of research scientists, engineers, doctors and epidemiologists are working together to develop and optimise Babylon’s AI abilities. Much of the team’s work is on the development of cutting edge machine learning research; this is being driven through access to large volumes of data from the medical community, continual learning from our own users and through feedback from Babylon’s own experts” (Babylon 2020).
In the clip, some users of Babyl celebrated access to healthcare services that they had only dreamed of before the technology’s debut in the country. Others lamented an enduring exacerbation of the divide between the rich and the poor since, among other things, the service depends on the use of a feature phone. Rwanda’s then Minister of Health—Diane Gashumba—also pointed out the popular digital health service’s lack of country-specific data; for example, Babyl—designed from a UK perspective—largely (and problematically) neglected the possibility of malaria when evaluating users’ symptoms.
Although the transmissibility of malaria is of little concern to residents of the UK, it continues to be a major public health issue in Rwanda. We are reminded here that “when humans craft or create something,” as Jordan Joseph Wales explains, “they reconfigure matter and its potentialities according to human imagination and purpose” (Green et al. 2022, p. 16), which are sometimes limited by (or to) the innovator’s specific context and scope of experience. The example raised above speaks to a need for greater cross-cultural, international cooperation in AI development and application (see ÓhÉigeartaigh et al. 2020), which Pope Francis has called for repeatedly.
Some commentators at the General Assembly emphasized the importance of remembering that behind every software is an investor, so the user is a consumer ipso facto. That is, conversations about the benefits and new possibilities promised by artificial intelligence, which are myriad, must be tempered by the recognition of AI as big business.
Others discussed an overall malaise with the datafication of human behavior and with the reduction of human beings to mere data suppliers. The issue raised here concerns the process of collecting and reading data from patient populations; participants underlined the implicit profiling that happens when gathering data, the possibility of firms enticing poorer countries with financial compensation in return for medical data, the need for humans to read the correlations such data produce, and the importance of being able to identify instances where algorithms are replicating or amplifying prejudices, assumptions, or biases that may have been programmed—consciously or not—into them.
While some signaled a potential erosion of the patient–healthcare professional relationship when AI allows both patients and physicians to have access to the same diagnostic tools, others highlighted that collaborative (instead of replacement/substitution) models of human–machine interaction have already shown evidence of being able to outperform healthcare directed exclusively by humans. Interestingly, a number of commentators predicted that if AI software could successfully complete a first round of collecting data, verifying medical history, and recording the symptoms of a patient, then human healthcare providers would be freed from this work, giving them more time to sit with the patient rather than peering over their computer screens while typing in information in a time-compressed setting. That is, in this way, AI could be focused on particular tasks, while the healthcare provider could be more thoroughly engaged in his or her encounter with the patient.
2. The Rome Call for AI Ethics
At the close of the General Assembly—just days before the Italian government would implement nationwide quarantine measures due to the COVID-19 outbreak—academicians, dignitaries, and a host of guests crowded into the Auditorium Conciliazione, just a stone’s throw from Saint Peter’s Basilica. One could not miss the giant screen upstage nor the word “renAIssance” repeatedly projected upon it, undoubtedly to underline the parallels between momentous periods in human history—the Renaissance and the age, as it were, of AI—marked by humanism, innovation, and imagination. An impressive line-up of speakers was introduced: Msgr. Vincenzo Paglia (president of the Pontifical Academy for Life), Mr. Brad Smith (president of Microsoft), Mr. John Kelly III (executive vice president of IBM), Mr. David Sassoli (president of the European Parliament), and Mr. Qu Dongyu (director general of the Food and Agriculture Organization). This roster was peculiar, considering that the event was being held on the heels of a primarily ecclesial meeting on artificial intelligence.
Msgr. Paglia delivered words penned for the event by Pope Francis, who could not attend because of illness. In the address, the pope spoke of “the digital galaxy, and specifically artificial intelligence,” as being “at the very heart of the epochal change we are experiencing,” which is transforming the way we think about space, time, and the human body (Francis 2020, paras. 2, 4). He underlined how decisions made in medical, economic, and social contexts increasingly reveal “the point of convergence between an input that is truly human and an automatic calculus” (Francis 2020, para. 2). One cannot ignore how “a simple ideological calculation of functional performance and sustainable costs” could dismiss the biographical dimension of humanhood in favor of a mechanistic view (Francis 2020, para. 5). Further, the pope urged caution regarding how “algorithms now extract data that enable mental and relational habits to be controlled, for commercial or political ends, frequently without our knowledge. This asymmetry, by which a select few know everything about us while we know nothing about them, dulls critical thought and the conscious exercise of freedom” (Francis 2020, para. 4). We add to these concerns those mentioned in the previous section.
Although the risk of deepening the divide between the haves and the have-nots through the steering of knowledge, wealth, and power into the hands of but a few must not go unchallenged, the pope made plain that, while new technologies are neither neutral nor value-free, one must not lose sight of their immense potential. Applauding the bringing together of persons from the Church, industry, politics, and science in (what appeared to be) a public commitment to the Common Good, Pope Francis proposed that “the ethical development of algorithms—algor-ethics—can be a bridge enabling those principles [of the Church’s social teaching] to enter concretely into digital technologies through an effective cross-disciplinary dialogue” (Francis 2020, para. 10).
Smith and Kelly spoke of a “new generation of opportunity.” They touted AI as perhaps the most powerful tool in the world, boasting incredible promise as well as a host of new challenges (including the weaponization of certain technologies, the link between AI and cyberattacks, AI and the fueling of mass surveillance, the automation of jobs, etc.). Smith called the Catholic Church a fundamental voice in the ethics of emerging technologies; he commended the invitation set out by the Rome Call for AI Ethics for encouraging the inclusion of a plurality of voices in discussions on AI, while underscoring the importance of the humanities, liberal arts, and ethics alongside STEM disciplines for acquiring a more complex and integrative set of skills. “The future of humanity,” Smith concluded, “depends on us making this right.” Kelly echoed these sentiments, reminding the audience that AI is very much a reflection of us as human beings, certainly to the extent that AI “learns” based on the data and processes that we choose to give it. He pressed that our view ought not to be human versus machine, as is popular fodder for the movie industry, but both human and machine working together for the democratization of knowledge and in pursuit of the Common Good. At the end of his presentation, Kelly cited a line that Pope Paul VI delivered during the Angelus of 20 July 1969, a few hours before the first moon landing: “The human heart absolutely must become freer, better and more religious as machines, weapons and the instruments people have at their disposition become more powerful” (Paul VI 1969, para. 2).
Many of the points raised by the speakers are featured in the Rome Call for AI Ethics, a document (or a declaration of sorts) that seeks to engage AI “movers and shakers”—in the Church, industry, NGOs, public institutions, politics—in committing to serious ethical reflection regarding the development and applications of artificial intelligence (Pontifical Academy for Life et al. 2020). Although spearheaded by Church leaders and later submitted to the Secretary of State of the Holy See for approval, the Rome Call is not an official publication of the Pontifical Academy for Life; it is, rather, the fruit of the work of some of the world’s leading experts on AI (including a number of members of the Academy).
At the end of the event, the sponsors of the Rome Call (Msgr. Vincenzo Paglia, Brad Smith, John Kelly III, Qu Dongyu, and Paola Pisano) officially became the first signatories, publicly expressing:
their desire to work together, in this context and at a national and international level, to promote ‘algor-ethics,’ namely the ethical use of AI as defined by the following principles: (1) Transparency: in principle, AI systems must be explainable; (2) Inclusion: the needs of all human beings must be taken into consideration so that everyone can benefit and all individuals can be offered the best possible conditions to express themselves and develop; (3) Responsibility: those who design and deploy the use of AI must proceed with responsibility and transparency; (4) Impartiality: do not create or act according to bias, thus safeguarding fairness and human dignity; (5) Reliability: AI systems must be able to work reliably; (6) Security and privacy: AI systems must work securely and respect the privacy of users.
The charge at present is to properly elucidate and elaborate on the principles of the Rome Call; to deliberate on how said principles might seriously and constructively influence policymaking and the development of AI at the industry level (especially when the upholding of these principles may appear to stand in the way of industrial innovation and productivity); and to increase this solidarity as we move ever so quickly into a new generation of opportunity.
3. Technology, Stewardship, and Ecological Solidarity
The Rome Call makes plain that the promotion of technology must not only be for the benefit of humanity, but also of the planet writ large. This requires, then, that our understanding of solidarity here be more ecological (or perhaps less anthropocentric) in scope, recognizing the interdependencies of human and nonhuman organisms in the natural world (Pontifical Academy for Life et al. 2020, pp. 2–4).
Now more than ever, we must guarantee an outlook in which AI is developed with a focus not on technology, but rather for the good of humanity and of the environment, of our common and shared home and of its human inhabitants, who are inextricably connected. In other words, a vision in which human beings and nature are at the heart of how digital innovation is developed …
In his encyclical letter, Caritas in Veritate, Benedict XVI (2009, n. 69) writes that “technology is never merely technology. It reveals man and his aspirations towards development, it expresses the inner tension that impels him gradually to overcome material limitations. Technology, in this sense, is a response to God’s command to till and to keep the land (cf. Gen 2:15) that he has entrusted to humanity, and it must serve to reinforce the covenant between human beings and the environment, a covenant that should mirror God’s creative love”. For the Church, technology—of whatever form—must be at the service of human beings; it can never be the other way around. As such, it is to be an expression of stewardship and service. In addition, technology must contribute to genuine progress, which, for the Church, is not simply an increase in human mastery over material existence, but must necessarily lead human beings “to exercise a wider solidarity” (Paul VI 1971, n. 41) and make concerted efforts to attenuate inequalities across the board (Francis 2019, para. 10). Accordingly, technological development and application must be rooted in, and directed toward, respect for the inherent dignity of human beings and all natural environments, paying close attention to the delicate complexity of ecosystems and the interdependencies spelled out within them (Green et al. 2022). In many ways, the first signatories of the Rome Call for AI Ethics—voices of the Church, industry, and government alike—agree on these points. In the document, we read:
In order for technological advancement to align with true progress for the human race and respect for the planet, it must meet three requirements. It must include every human being, discriminating against no one; it must have the good of humankind and the good of every human being at its heart; finally, it must be mindful of the complex reality of our ecosystem and be characterised by the way in which it cares for and protects the planet (our “common and shared home”) with a highly sustainable approach, which also includes the use of artificial intelligence in ensuring sustainable food systems in the future.
This wider ecological solidarity, then, goes beyond simply recognizing interdependencies; it promotes “a vision in which human beings are part of the social-ecological community and have a responsibility, a moral duty, to understand and develop actions according to their impacts on the components of that community” (Mathevet et al. 2018, p. 608).
In his message for the 1990 World Day of Peace, Saint John Paul II (1990) proclaimed that the ecological crisis was a global moral issue, and he made an urgent appeal for a new interpretation of solidarity not unlike the one discussed above (n. 10). He spoke of “the need for concerted efforts aimed at establishing the duties and obligations that belong to individuals, peoples, States and the international community,” making plain that world peace is not only threatened by war, conflict, and the many injustices among peoples, but also by “a lack of due respect for nature” (n. 1, 15). Twenty years later, Benedict XVI (2010, n. 6, 8, 12, 14) would go on to argue that the fostering of this widened solidarity—which broadens our care beyond humanhood—remains imperative and that it is, in fact, our duty to do so.
4. Laudato Si’ and the Technocratic Paradigm
Pope Francis’s second encyclical, Laudato Si’, brought the concerns of his predecessors to the fore. Although the mandate to protect, tend, till, and guard the earth—spelled out in the first pages of Genesis (2.15)—has been an important theme in Catholic teaching, I think it is fair to say that it never gained the attention of the lay faithful in the way that it has during the pontificate of Francis.
Care of Creation is one principle among many of the Church’s social doctrine. However, Laudato Si’ ultimately shows that one cannot think about it as separate from the others. That is, solidarity, the inherent dignity of human persons, the rights of workers and the dignity of work, the Common Good and our responsibility to participate in community, human rights and obligations to self and others, and the preferential option for the poor have bearing on, and in turn are affected by, the health of the planet. Indeed, “everything is connected” is the most common refrain of the encyclical (Francis 2015, n. 6, 16, 42, 91, 117, 240). As an example, Pope Francis writes that “the earth herself, burdened and laid waste, is among the most abandoned and maltreated of our poor” (Francis 2015, n. 2), reframing our sense of poverty and vulnerability in the context of this widening vision of solidarity promoted by John Paul II and Benedict XVI before him.
References to technology abound in the text. Here, Laudato Si’ shares the concerns expounded by the Pontifical Council for Justice and Peace in the Compendium of the Social Doctrine of the Church regarding the unraveling of the human–nature relationship. As we have said, the Church marvels at technological innovation inasmuch as it promotes the good of humanity (Pontifical Council for Justice and Peace 2004, n. 6). Although humans were created with the explicit mandate to till and to tend, one cannot ignore that Genesis also speaks of a God who commands humankind to subdue the earth “and have dominion over the fish of the sea and over the birds of the air and over every living thing that moves” upon it (Gen. 1.28). This divine order “to subject to himself the earth and all that it contains” (Pontifical Council for Justice and Peace 2004, n. 456), although it must never be thought of as absolute or despotic, welcomes some degree of human manipulation of the natural world.
Far from thinking that works produced by man’s own talent and energy are in opposition to God’s power, and that the rational creature exists as a kind of rival to the Creator, Christians are convinced that the triumphs of the human race are a sign of God’s grace and the flowering of His own mysterious design. In this regard, the Magisterium has repeatedly emphasized that the Catholic Church is in no way opposed to progress, rather she considers “science and technology are a wonderful product of a God-given human creativity, since they have provided us with wonderful possibilities […]”. For this reason, “as people who believe in God, who saw that nature which he had created was ‘good’, we rejoice in the technological and economic progress which people, using their intelligence, have managed to make”.
Even though the Compendium looks with measured hope to technology as a potentially powerful tool to solve a number of global problems, such as hunger and disease (Pontifical Council for Justice and Peace 2004, n. 458), Pope Francis—who does not reject this—tempers the optimism here by arguing that an investment in more science and more technology may be missing the point. Joining his voice with that of the Patriarch Bartholomew, the pope pleads for “a change of humanity; otherwise we would be dealing merely with symptoms. He [Bartholomew] asks us to replace consumption with sacrifice, greed with generosity, wastefulness with a spirit of sharing, an asceticism which ‘entails learning to give, and not simply to give up. It is a way of loving, of moving gradually away from what I want to what God’s world needs. It is liberation from fear, greed and compulsion’” (Francis 2015, n. 9; see also n. 60).
Laudato Si’ begins where the Compendium ends its discussion on technology, with a warning that has become a lament:
Man […] must never forget that “his capacity to transform and in a certain sense create the world through his own work ... is always based on God’s prior and original gift of the things that are.” He must not “make arbitrary use of the earth, subjecting it without restraint to his will, as though it did not have its own requisites and a prior God-given purpose, which man can indeed develop but must not betray.” When he acts in this way, “instead of carrying out his role as a co-operator with God in the work of creation, man sets himself up in place of God and thus ends up provoking a rebellion on the part of nature, which is more tyrannized than governed by him”.
Mechanizing, reducing, atomizing, and instrumentalizing nature have distorted the call to stewardship (to till and to tend) into an exercise of unconditional dominion over the natural world; the Church sees this as the root of our current ecological crisis (Pontifical Council for Justice and Peace 2004, n. 461–65).
It is from here that Pope Francis takes up his critical assessment, in Laudato Si’, of the dominant technocratic paradigm of our day that exalts mastery and control above relationality, responsibility, and accountability, forsaking the value of limitation altogether (Francis 2015, chp. 3). In this framework, the connection between humankind and nature is not covenantal, but confrontational (Francis 2015, n. 106). “Those who are surrounded with technology,” the pope writes, “‘know full well that it moves forward in the final analysis neither for profit nor for the well-being of the human race,’ that ‘in the most radical sense of the term power is its motive—a lordship over all’” (Francis 2015, n. 108).
Further, the hyperspecialization of technologies may lead to a reduction of the complexity of the ecological crisis in order to divide it into manageable parts that could be dealt with without paying heed to how they are interrelated or interdependent. There needs to be a more comprehensive and integrative way of looking at things, the pontiff suggests, “otherwise, even the best ecological initiatives can find themselves caught up in the same globalized logic” (Francis 2015, n. 110–11). I have in mind here the double-edged sword that AI-based systems can be in the context of the environment. On the one hand, artificial intelligence and machine learning can be exceedingly valuable tools: they can help optimize energy generation and use; they can be used to monitor invasive or endangered species, pollution levels, changes to land, and air quality; and they can identify patterns in the extent of Arctic sea ice or track coral bleaching, for instance. And yet, these same AI-based systems (including AI hardware, digital infrastructure, and data centers) can also have a significant carbon footprint; can produce substantial levels of greenhouse gas emissions; and can generate considerable electrical and electronic waste.
In spite of the emphasis, in Laudato Si’, on the importance of understanding, as a bare minimum, that everything is connected and in spite of the plea to widen our communal sense of solidarity, it is interesting to note that the Church stops short of espousing, or championing the adoption of, a biocentric or ecocentric worldview (Pontifical Council for Justice and Peace 2004, n. 463; Francis 2015, n. 118). According to the Church, doing so might risk an “egalitarian consideration of the ‘dignity’ of all living beings” that could very well efface the ontological and axiological differences between humans and all others (Pontifical Council for Justice and Peace 2004, n. 463). At the end of the day, the Catechism of the Catholic Church (1993) makes clear that human beings are the summit of God’s creative work (n. 343). This said, Laudato Si’ entirely rejects an “excessive,” “misguided,” or “tyrannical” anthropocentrism that denies the dignity and integrity of Creation as a whole and is, therefore, unconcerned for other creatures (n. 68, 116, 118, 119).
5. Concluding Remarks
It is crucial that we draw out of the Church’s vision something that could easily get lost in the oft-prophetic (if not salvific) rendering of the scope and potential of artificial intelligence to change the world and to tackle every imaginable problem confronting humankind: the value of limitation. Discourse on technological innovation is more often than not about the surpassing (or obliteration) of limitation in a way that might bring us to new heights of performance and well-being.
Laudato Si’ reminds us that the Biblical mandate to subdue the earth and have dominion over it is both entrusted and limited. As John Paul II declared in Laborem Exercens: “The word of God’s revelation is profoundly marked by the fundamental truth that man, created in the image of God, shares by his work in the activity of the Creator and that, within the limits of his own human capabilities, man in a sense continues to develop that activity, and perfects it as he advances further and further in the discovery of the resources and values contained in the whole of creation” (John Paul II 1981, n. 25).
Human beings are called to imitate God in their work. Likewise, they are also called to imitate God in their rest. It is not superfluous that, in Genesis (2.2–3), the Sabbath immediately follows the creation of humankind. Exodus (20.8–11) and Leviticus (23.3) go on to make plain that “ceasing” from work and allowing the land to rest from our use are not gestures of leisure, kindness, or reward, but acts of obedience to divine commandment. The “excessive” anthropocentrism that Pope Francis cautions against can lead to an “excessive” use of the natural world and an enthrallment with “the possibility of limitless mastery over everything” (Francis 2015, n. 224). In his critique of the technocratic paradigm that is pivotal to Laudato Si’, the pope argues that the confrontational relationship between human beings and nature—in many ways perceived as one between “master” and “servant”—“has made it easy to accept the idea of infinite or unlimited growth, which proves so attractive to economists, financiers and experts in technology. It is based on the lie that there is an infinite supply of the earth’s goods, and this leads to the planet being squeezed dry beyond every limit” (Francis 2015, n. 106). The Church and the sponsors of the Rome Call for AI Ethics seem to recognize this (although I cannot be sure how tech giants will translate the above call to limitation and rest). They warn that the ostensibly limitless potential of AI must not distract us from taking pause to think critically and constructively about risk, inequality, discrimination, the dignity of work, and the technologization of humankind.
In his book, Faith and Doubt: Studies in Traditional Jewish Thought, Rabbi Norman Lamm writes:
Perhaps the most powerful expression of the Bible’s concern for man’s respect for the integrity of nature as the possession of its Creator, rather than his own preserve, is the Sabbath […]. The six workdays were given to man in which to carry out the commission to “subdue” the world, to impose on nature his creative talents. But the seventh day is a Sabbath; man must cease his creative interference in the natural order (the Halakhah’s definition of melakhah or work), and by this act of renunciation demonstrate his awareness that the earth is the Lord’s and that man therefore bears a moral responsibility to give an accounting to its Owner for how he has disposed of it during the days he ‘subdued’.
Like the Jewish tradition, Catholicism applauds the fruit of human genius in technological innovation, but it also teaches the importance of restriction in our interference with the natural world and the value of limitation in our dominion (that is, our stewardship) over it. A widened ecological solidarity moves beyond a vision of care that is fixed solely on humanhood and seeks to account for other creatures in the ecosystem, not only as resources or means, but as having a dignity of their own by virtue of their createdness.
Technologies, including AI-based systems, have an impact on the way human beings relate to each other and to the environment. If pursuits in AI and other technologies are meant to extend the scope of human mastery (by programming machines and robots, for instance, as tools to achieve such a thing), the Church will measure progress not based on how much more control we have harnessed, but on whether ecological solidarity has been enhanced in our duty to care for human beings and the natural world.