In his show Last Week Tonight (full episode here), John Oliver says "If it seems like everyone's talking about AI, it's because they are."
This page explores how church leaders can use artificial intelligence constructively and ethically to enhance their ministries and decolonize the church. It also explores potential dangers and how to keep yourself safe!
Most of these topics are covered in my webinar with Derek Terry on the Open and Affirming Coalition, which I recommend you check out. A full transcript and accompanying slides are available at the link below.
It can be difficult to find royalty-free paintings or depictions of biblical stories that feature darker-skinned people, LGBTQ+ people, women, and people with disabilities. Using artificial intelligence, you can paint a new picture. Representation matters! The label on the image here was cut off, but the image was generated with AI using the text prompt "a dark-skinned 1st-Century Judean man who is walking on water reaches out to another 1st-Century Judean man who is trying to walk on water too but who is getting distracted by the waves around him, watercolor."
A major drawback of using AI: while your art may shift folks' picture of who Jesus was and could be, you are not supporting artists, including artists of color.
I encourage you to hire local artists, when possible (and pay them fairly!). You may also consider using liturgical resources like A Sanctified Art, which supports the art and poetry of women and people of color. Have a conversation with your leadership team about how you want to use the gifts of your church to support artists from the global majority!
The cover image for this project was created using Adobe Firefly, a free program that generates images from text prompts. Unlike other AI art-generating programs, Firefly draws only on royalty-free images that already exist in Adobe's stock library.
Make sure that if you use AI-generated art you include some indication that the artwork was created with AI (note the Adobe Firefly tag in the bottom left of many of the images in this project). As new programs emerge, do your research to make sure they are not stealing from artists without their permission!
AI is created by human beings and therefore reflects human biases.
For example, when I asked for "A Protestant pastor preaching in a church," Adobe Firefly gave me four options, all of them able-bodied men.
It is also interesting that Firefly has a clear sense of what "church" means: a brick-and-mortar building (with empty pews!).
Prompts like "A female pastor" or "A pastor with a cane" may direct AI to draw the picture you are looking for, but it is clear the biases of our existing culture, artwork and literature impact AI generative art.
Biases in AI are not limited to art. Facial recognition software, for example, is less likely to recognize brown faces, and medical AI programs designed to detect skin cancer are less likely to detect cancer on darker skin (the majority of the medical images they draw from depict pale skin).
Chatbots like ChatGPT, Bard (now Gemini), and Goblin Tools are enthusiastic collaborators, ready to help you with your latest challenge! They can spark ideas, point you to resources, help you manage your workflow, and possibly save you time.
Chatbots are excellent for creative problem-solving, including the work of decolonization if used in conjunction with meaningful conversations with community partners.
See the PDF below for how ChatGPT and Gemini responded to the prompts "We have a new trans member joining the church. What are some ways we could make trans members feel more welcome and included?" and "I'd like to do some decolonizing work in my church. What are some places I might start?" Gemini notably included links to useful resources where I could learn more. I was blown away by their responses!
Chatbots are neither human nor sentient. They recognize patterns and synthesize existing data. Just like generative AI for art, they can exhibit biases. Most programs include a "report" feature you can use to flag biased responses and help improve the program.
They are generally NOT good for researching facts (they will make things up), nor are they a good source of legal advice. See, for example, this article discussing the way chatbots are providing inaccurate information about elections.
Chatbots are also not a substitute for genuine human interaction!
They do come in handy for drafting (not finalizing) guidelines, writing challenging emails, creating meaningful object lessons for children, reimagining Bible stories through unexpected lenses, and even writing computer code or Excel formulas to help you balance your budget.
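For instance, here is a minimal sketch, in Python, of the kind of budget-totaling snippet a chatbot might draft for you (the category names and amounts here are hypothetical placeholders, not a recommended budget):

```python
# A hypothetical monthly church budget, the sort of thing a chatbot might draft.
expenses = {
    "Worship supplies": 250.00,
    "Outreach and missions": 600.00,
    "Building and utilities": 1200.00,
    "Staff salaries": 4500.00,
}

monthly_giving = 7000.00  # placeholder income figure

# Total the expenses and see what is left over.
total_expenses = sum(expenses.values())
balance = monthly_giving - total_expenses

print(f"Total expenses: ${total_expenses:,.2f}")
print(f"Remaining balance: ${balance:,.2f}")
```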
Like all tools, they require a deft human hand to operate. Never copy and paste something from a chatbot directly; make sure you are adding a human touch (and human critical thinking!) every step of the way.
Spam phone calls, con artists, and people photoshopping images to create a false narrative are nothing new. What is new is the way people are using AI to generate more convincing photos, video, and sounds.
There are now machine-learning tools that reproduce human voices using short clips of existing audio. Most that are available to the public are still unconvincing; unfortunately, they are improving.
In a Senate hearing on AI, Gary Schildhorn testified that a scammer impersonated his son's voice and tried to extort money from him. Mr. Schildhorn caught on to the scam, but the caller was convincing.
It is time NOW to talk to your friends, colleagues, and congregants about this possibility. My family has a code word we use for phone calls, especially phone calls involving money.
How will your church members know a voicemail is from you? What can they do if the call seems fishy? How are you educating your congregation about how to avoid scams?
Video cloning software is being sold as a tool corporations can use to create training videos without actually having to film. Companies can create an "avatar" of a person and then give them a script to read (in hundreds of languages!).
On the plus side, this may mean that one day you can film a sermon or lecture and post it in Spanish and Korean, even if you don't speak those languages. On the negative side, it is increasingly possible that someone could use publicly available audio or images of you to create fake videos.
In the video, the clone is unconvincing, but the technology is improving. How will you let folks know what is real...?
AI art is getting better, but it is still poor at replicating some human features, hands in particular! Hands are difficult for human artists to draw, too. Look at images closely to spot things like too many fingers, thumbs in the wrong place, crossed eyes, missing earlobes, and more.
As tools improve, AI-generated content will be more difficult to spot. It is vital that you research and fact-check any image or story you see, especially if you see it on a social media platform like Facebook!
Using a decolonial lens, we must consider issues of control and transparency when it comes to AI algorithms.
Currently, the algorithms and datasets for ChatGPT and Gemini are NOT open source. They are proprietary. In other words, while we see the inputs (our prompts) and the outputs (their response), we have no idea how the programs get from A to B.
This makes it incredibly difficult to identify biases in the training data. It also prevents users from understanding how their data is being used to train the chatbots, raising potential privacy concerns.
On the other hand, chatbot companies hire thousands of people to monitor and correct their algorithms and fix detected biases (whether they are hiring diverse employees to do this work is another story--Google's Gemini said "due to the privacy and confidentiality surrounding specific hiring practices, I cannot disclose specific details about the team's composition or quotas").
It could be argued that if there were more transparency, bad actors might steal the code and create their own bots to use for nefarious purposes.
How transparent chatbots should be about their datasets and algorithms is not a simple topic! I asked Gemini to discuss with me how historical figures and decolonial theorists might weigh in on the issue. You can read its response below.