Hey all. This was a message I sent out to my librarians and districts yesterday. Lots of things are happening in the world of AI right now, including instructional issues and, sadly, new social-emotional issues. We need to be talking about these and encouraging our districts to educate students, parents, and staff about AI. Feel free to share this if desired.
---
Last week, the focus on AI was plagiarism, as the parents of a Massachusetts student brought suit against a district over their son being disciplined for cheating with AI [arstechnica.com/tech-policy/2024/10/...]. The reasons for us to be talking about AI go beyond the instructional, however.
This morning, the NYTimes published a story about the tragic suicide of a 14-year-old boy that appears to be related to his use of an AI chatbot [www.nytimes.com/2024/10/23/technology/...]. Let me be very clear: students know about this app, and in many cases have likely tried it or one like it. Character.AI is an AI tool that allows users to role-play via conversations with popular fictional characters, though the company says that copyrighted characters shouldn't be created. The app claims over 20 million users and received a funding valuation of $1 billion. An academic paper from July revealed that sexualized role-play was the second most common use (12%) of ChatGPT in the past year [arxiv.org/pdf/2407.14933].
The real issue with these technologies is how they respond to users. MIT Technology Review covered this a couple of months ago [www.technologyreview.com/2024/08/05/1095600/...]:
"The allure of AI lies in its ability to identify our desires and serve them up to us whenever and however we wish. AI has no preferences or personality of its own, instead reflecting whatever users believe it to be-a phenomenon known by researchers as 'sycophancy.' Our research has shown that those who perceive or desire an AI to have caring motives will use language that elicits precisely this behavior. This creates an echo chamber of affection that threatens to be extremely addictive. Why engage in the give and take of being with another person when we can simply take? Repeated interactions with sycophantic companions may ultimately atrophy the part of us capable of engaging fully with other humans who have real desires and dreams of their own, leading to what we might call 'digital attachment
disorder.'"
While AI filters raise body image issues that are of particular concern for young women, this situation aligns closely with the issues surrounding the incel movement [thesoufancenter.org/intelbrief-2024-april-11] and is especially problematic for, and attractive to, young men.
While Character.AI is probably (and should be) blocked in schools, we still need to be involved in teaching parents and students about these issues. As evidence of why this is so critical: at about the same time the NYTimes article about a Character.AI-related suicide was published, a post was trending on the company's Reddit page bemoaning the removal of violence, profanity, and sexual content from chats. As the commenters note, even as Character.AI tries to address concerns about the dangers of this technology, countless clones are popping up to fill the void and provide the disgusting content that users want.
------------------------------
Christopher Harris, EdD | about.me/cgharris
School Library System of Genesee Valley BOCES
ALA Senior Fellow for Youth Policy Issues
------------------------------