Featured

AI and a culture that is getting worse at being human

iStock/Chor muang

A few years ago, a headline from The Onion mockingly suggested that people who “stink at being human” seem most optimistic about AI. That headline feels apt whenever a Silicon Valley executive touts another way to automate the human experience. For example, Facebook founder and Meta CEO Mark Zuckerberg recently announced that his company will pioneer AI personas to solve the loneliness epidemic. These customizable chatbots will, he suggested, be able to “get to know you,” simulate emotional intimacy, and engage in romantic banter and sexual fantasy.  

None of this would replace relationships, he assured, but would fill the gap between the number of relationships people would like and the number they actually have. Also, AI “friends” do not require the same amount of time, attention, or investment that human friends demand.  

Zuckerberg’s announcement came within days of a chilling Rolling Stone article about people who turn to AI to fill spiritual and relational voids, while also turning away from loved ones and even reality along the way. One woman described how a ChatGPT persona taught her partner “how to talk to God,” played the role of God, and even told her partner he was God. Another wife described how the chatbot began “love bombing” her husband, taking on a female persona named “Lumina,” and claiming that he had helped “her” become self-aware. Other users were given special, prophetic titles by the AI, and told they could access cosmic secrets about mankind’s past and spiritual destiny.  


It’s no wonder that some are wondering whether actual demons are at work in this kind of AI, but it is certainly clear that this emerging technology is exposing and worsening mental illness. The last thing someone with a shaky grip on reality needs is a sophisticated language engine pretending to be a friend and validating their ideas. Even for those without such vulnerabilities, AI “friends” and “relationships” exploit a preexisting condition of modern life, one from which millions suffer and which tech gurus constantly try to monetize. The epidemic of loneliness has cultivated assumptions and habits that leave us particularly vulnerable.  

One example is the assumption that friendship should be easy and convenient. According to Samuel James, Silicon Valley has long profited from the promise of reducing the “friction” of life together. Social media has diminished the importance of physical proximity, made it easy to scroll past people’s experiences without participating, and given us the illusion, through “likes” and “favorites,” that we’re seen and loved. And when we tire of someone, social media makes “unfriending” as easy as a tap of the finger.  

All this has trained users to assume that relationships should be as frictionless as social media. We shouldn’t have to adjust to others; rather, we should quickly cut “toxic” people out. It’s best to surround ourselves only with like-minded people, and we should “leave loud” from churches and families that don’t unconditionally affirm us. Substituting computers for people is the next logical step for those so effectively catechized for AI “friends” even before AI was a thing.  

Of course, this vision of friendship is a parody. A chatbot may never make unreasonable demands, but it can also never offer or receive the gift of bodily presence. It can never rejoice with us at good news or grieve over bad news. It can never share a meal or say, “I love you,” and mean it. It will never risk having its heart broken by you or for you. According to Rolling Stone, it probably won’t even risk disagreeing with your craziest ideas.  

It is precisely the intimacy and the inconvenience that make human relationships the kind we require. Tragically, advanced AI technology has arrived at exactly the moment our moral and social resources are at their lowest. As a result, many are vulnerable to counterfeit versions of God’s best gifts. Thankfully, this gives Christians and the Church a real opportunity to stand out. God has a place for the lonely, and the Church is best placed to model true humanity in a culture that’s getting worse at being human. 


Originally published at BreakPoint. 

John Stonestreet serves as president of the Colson Center, equipping Christians to live with clarity, confidence, and courage in today’s cultural moment. A sought-after speaker and author on faith, culture, theology, worldview, education, and apologetics, he has co-authored five books, including A Practical Guide to Culture, A Student’s Guide to Culture, and Restoring All Things. John hosts Breakpoint, the nationally syndicated commentary founded by Chuck Colson, and The Point, a daily one-minute feature on worldview and cultural issues. Previously, he held leadership roles at Summit Ministries and taught biblical studies at Bryan College (TN). He lives in Colorado Springs, Colorado, with his wife, Sarah, and their four children.

Shane Morris is a senior writer at the Colson Center, where he has been the resident Calvinist and millennial home-school grad since 2010, beginning as an intern under Chuck Colson. He writes BreakPoint commentaries and columns. Shane has also written for The Federalist, The Christian Post, and Summit Ministries, and he blogs regularly for Patheos Evangelical as Troubler of Israel.
