A few weeks ago, I used ChatGPT for the first time to help me think through a personal matter to do with an ongoing relational issue. I use AI a bit in my professional life, but I’ve never tried it before for personal and faith-related issues. In fact, I was apprehensive about doing so, because many have argued that AI cannot replicate human relationship and that it risks short-circuiting personal and spiritual formation.
But I’m aware that increasingly people are turning to AI for advice (the top current use of AI is therapy and companionship), and it would be naïve to assume that Christians are not swept up in this trend. I therefore thought it would be helpful to gain first-hand experience – not with an agenda to discredit it, but to test genuinely whether it might be beneficial.
Starting with prayer
I’m aware that a particular danger for me is that I like to know things: I spend a lot of my time researching and reading, but I am less good at protecting time to know God and to come to him in prayer.
So before using ChatGPT, the very first thing I did was to pray:
Lord, I am aware of my tendency to rely on myself, and gravitate towards knowing things rather than knowing you. Please help this experiment with AI to be a way of being more dependent on you and knowing you better.
I then considered carefully what to ask ChatGPT (this is called a prompt), taking into account some particular weaknesses of AI.
Context matters
ChatGPT and similar tools are poor at inferring context. This means they don’t understand the unspoken background or assumptions that humans use to make sense of the world. They simply predict the most probable next word in a sequence based on a massive trawl of existing information from the internet and other sources.
For example, unless you specify that you are a committed Christian seeking to live by biblical principles, ChatGPT won’t reflect that context in its response. Nor will it reflect all the other things that shape you and your situation, some of which you might not even be conscious of.
In that way, using AI is in fact a poor substitute for protecting the time to develop deep Christian friendships, especially in the local church – relationships that understand your history, background and current context. The advice such friends can offer is imperfect, yes, but naturally much more contextually rich, because it comes from personal encounters and shared experience.
For AI to do anything like the same requires us to share detailed context in our prompts – the sort of detail that likely only comes from the self-awareness generated by many human conversations over the years with friends who care. For me, this meant not only sharing that I was a committed Christian seeking to obey the Bible, but also summarising a personal history of my own character strengths and likely blind spots, and a well-balanced, not overly biased reflection on others involved in the situation, before requesting any specific advice.
Purpose matters
AI lacks moral or ethical understanding – it merely reflects human values and biases from its training. As such, it needs to be prompted explicitly about desired end goals, otherwise it will simply mirror the goals and purposes embedded in the vast amount of information on the internet.
It takes some wisdom to identify a desirable goal. As humans, we find it easier to specify sub-goals – I want to find a life partner, I want to get a better job, I want to manage a conflict effectively – than more ultimate goals, such as wanting to glorify God and live in a way that pleases him.
I added to my prompt that whilst I was looking for some specific advice on a situation, the bigger picture was that I wanted to deal with it with grace and truth, in a way that depended on Jesus who gives me strength rather than on my own strength.
Be aware of the mirror effect
I discovered that ChatGPT was extremely (almost disconcertingly) validating and affirming. I can see why people might gravitate to it for therapy. It appeared to empathise sincerely with my circumstances and complimented me several times on my thoughtfulness – presumably because it could see from my prompts that this mattered to me. I also have no doubt that it would have reinforced my desire for power if I had indicated this was my priority, or perhaps (more troublingly) further encouraged me in unhealthy ways of thinking if I had in some way been determined to be very hard on myself.
In short, it mirrored what I wanted without necessarily challenging those desires, and I think this is what makes AI at best a potential instrument but not a friend. AI won’t love me or want the best for me. It will simply do what I ask it to do because – as is argued in Stephen Driscoll’s excellent book on AI, ‘Made in Our Image’ – it is made in our image rather than in God’s image. If I wanted in some way, explicitly or even unconsciously, to be validated in a way that was contrary to God’s word, it would have been more than happy to oblige. I was reminded of the warning in 2 Timothy 4:3-4 about seeking advice that scratches our itching ears – we will hear what we want to hear; we will read what we want to read.
Of some value, with the right safeguards
Of course, these warnings could equally be sounded about books, podcasts, blogs, and the advice of certain people. So with all the above in mind, what did I make of the advice that AI provided? Truthfully, a few things were pleasantly surprising. Because of my stated desire to remain dependent on Jesus, ChatGPT provided me with prayers and Bible verses that reminded me to re-anchor in Christ rather than in self-effort. Taking into account the context I had provided, it gently challenged me to reframe how I was interpreting the behaviour of others through Jesus’s eyes. For a couple of pieces of advice, I questioned ChatGPT about where in Scripture that advice was supported, and it provided further, and in my opinion appropriate, biblical references.
I spent some time praying through the suggested prayers so that I was engaging with the advice on a relational level and not just intellectually, and asked that God would take away anything that was unhelpful. I also committed to share what I had discovered afterwards with a Christian friend, so we could reflect together on whether it was faithful advice. Using AI is certainly no quick fix and requires an enormous amount of thought, attention and cross-checking.
Would I use it again? I’m not sure. Certainly only with continued care. Any advice from AI is only as good as what we ask of it and how we evaluate it. The first priority for every believer is to protect time to engage with God’s word ourselves (not outsourcing the task to AI!) so that we know his heart for his people, shaping our desires. We need to become intimately familiar with his voice so we can discern whether any advice we receive is in keeping with his word. And we need to be investing in Christian relationships that help us know ourselves, be known, and do the same for others, so that there is the right context for wisdom.
As a final note, the situation I was seeking advice about changed overnight in an unexpected way – the other parties involved were moved to behave differently, so I didn’t in fact need to deploy the specific advice ChatGPT had given me. It was a reminder that we need to leave room for God’s grace to work in unexpected ways.
Perhaps the great value of my experiment was more the time spent praying to the Lord and growing in dependence on him, than any wisdom that AI was able to impart.
Disclaimer: this article was written without any input from AI – imperfect certainly, but from the heart with love.
