The London Institute for Contemporary Christianity


Artificial intelligence: building for joy

What is the probability of artificial intelligence running amok and causing havoc in an extreme way?

This idea has increasingly been hitting the headlines and was given a serious place at the table of world politics with the first international AI Safety Summit last week.

Regulation is obviously worth pursuing, but what about the values embodied by AI itself? Rather than focusing on prohibition, how can we proactively rig the system for good? AI trains on data sets, and these data sets are shaped by human behaviour – meaning sin is built into the algorithms. The result is that people are manipulated, commodified, and drawn into patterns of behaviour that are addictive, mindless, or even malicious.

Instead of simply accepting that we suffer from the ‘sin of the parents’ (Exodus 20:5), some have begun to ask whether specific virtues should be deliberately built into these data sets. The Christian philosopher Dr Rachel Siow Robertson and her colleagues have been looking at how we can engage more fruitfully online by focusing on joy. Their working definition of joy is ‘an intense feeling of fulfilment and a deep alignment between some good in the world, and oneself and others’.

This principle has been used to develop an alternative framework for testing technology called MIIND. Its criteria ask whether a product encourages motivation, such as creativity, and promotes healthy integration with the world, self, and others. What is the user’s intensity of experience? Is the product normative (i.e. does it establish moral or aesthetic norms), and does it enable users to recognise their dependence on external factors for wellbeing?

Other measures of the impact of technology on individuals are somewhat ‘thin’, looking at short-term satisfaction and the ‘stickiness’ of applications such as news apps that can keep users doom scrolling for hours. MIIND, however, helps developers look at ‘thick’ user experiences that enable people to pay loving attention to others. For example, virtual reality can be used to raise awareness of issues in a way that leads to hopeful action rather than capitalising on attention-grabbing stories of suffering.

Jesus modelled a way of relating to people that recognised them as individuals worth paying attention to, helping them to grow in their unique character and capabilities. So let’s ask a new question: how can we be a voice and a catalyst for realistic practical action to both use and create AI-based technologies to build lasting joy?

Dr Ruth Bancewicz
Church Engagement Director, The Faraday Institute for Science and Religion, Cambridge

Comments

  1. This subject sounds interesting. I’m just completing an assignment on Theory of Mind (ToM) which includes a question on Artificial Intelligence and the various types of methods used to simulate the human mind.

    By William Graham  -  17 Nov 2023
  2. This is really helpful – do you have a link to Dr Rachel Siow Robertson’s work?

    By Dan  -  17 Nov 2023
  3. I would like to follow this discussion.

    By Maureen Ellis  -  17 Nov 2023
  4. I think there is so much potential to embed good values into these systems from the start in a variety of ways, such as curation of the data used to train AI systems.
    One nice example of making small changes that could have a strong positive impact was when a parents’ group asked Amazon to update the way one interacts with Alexa, so that when children use it they are encouraged to ask for things politely and say thank you. That’s quite a small thing, but it’s an example of how we should be thinking about embedding good values and behaviours at every step in the development of technology.

    By Stephen Haddad  -  18 Nov 2023
  5. This is a fascinating conversation, and it is very pertinent to the current developments and voices in the field of AI. We were built to thrive on relationship; we are communal beings, and our flourishing is tied to the nature of those relationships, in both a positive and a negative sense. The question is: can we build lasting joy into AI technologies? So far we haven’t really succeeded in building lasting joy into anything much at all, have we? Joy is not an anomalous concept; it’s an experience that we share with each other, it’s relational, isn’t it? The thing we lack in our communities is any sense of joy – it’s all about negativity, conflict, and a lack of cultivating relationships with others. I know this sounds very negative, but I wonder, if this is what our communities look like, how are we ever going to build lasting joy into AI technologies? What are your thoughts?

    By Mark Jeffery  -  24 Nov 2023
    • I guess it depends which communities you’re looking at, and how you’re assessing them. Of course it’s always going to be mixed, but if we intentionally create and seek out communities where joy and the values Rachel and her colleagues are talking about are cultivated, can we be more confident that people in those communities can influence or be part of the tech world in a positive way?

      By Ruth Bancewicz  -  22 Jan 2024
