<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Issue 6 &#8211; The Hilltop Monitor</title>
	<atom:link href="https://hilltopmonitor.jewell.edu/category/volume-40/issue-6/feed/" rel="self" type="application/rss+xml" />
	<link>https://hilltopmonitor.jewell.edu</link>
	<description>The Official Student Publication of William Jewell College</description>
	<lastBuildDate>Thu, 07 May 2026 17:16:11 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	

<image>
	<url>https://hilltopmonitor.jewell.edu/wp-content/uploads/2023/07/cropped-3-32x32.png</url>
	<title>Issue 6 &#8211; The Hilltop Monitor</title>
	<link>https://hilltopmonitor.jewell.edu</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>An Aurora on the Hill</title>
		<link>https://hilltopmonitor.jewell.edu/an-aurora-on-the-hill/</link>
					<comments>https://hilltopmonitor.jewell.edu/an-aurora-on-the-hill/#respond</comments>
		
		<dc:creator><![CDATA[Matthew Parker]]></dc:creator>
		<pubDate>Mon, 17 Nov 2025 21:54:36 +0000</pubDate>
				<category><![CDATA[Arts & Culture]]></category>
		<category><![CDATA[Features]]></category>
		<category><![CDATA[Issue 6]]></category>
		<category><![CDATA[Multimedia]]></category>
		<category><![CDATA[Volume 40]]></category>
		<category><![CDATA[aurora]]></category>
		<category><![CDATA[Gano clock tower]]></category>
		<category><![CDATA[jewell]]></category>
		<category><![CDATA[matthew]]></category>
		<category><![CDATA[matthew parker]]></category>
		<category><![CDATA[northern lights]]></category>
		<category><![CDATA[plc]]></category>
		<guid isPermaLink="false">https://hilltopmonitor.jewell.edu/?p=20606</guid>

					<description><![CDATA[]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><img fetchpriority="high" decoding="async" width="1024" height="577" src="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_223847-1-1024x577.jpg" alt="" class="wp-image-20612" srcset="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_223847-1-1024x577.jpg 1024w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_223847-1-800x450.jpg 800w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_223847-1-768x432.jpg 768w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_223847-1-1536x865.jpg 1536w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_223847-1-2048x1153.jpg 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /><figcaption class="wp-element-caption">The Northern Lights over the north side of the Quad</figcaption></figure>



<figure class="wp-block-image size-large"><img decoding="async" width="1024" height="577" src="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224318-1-1024x577.jpg" alt="" class="wp-image-20613" srcset="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224318-1-1024x577.jpg 1024w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224318-1-800x450.jpg 800w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224318-1-768x432.jpg 768w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224318-1-1536x865.jpg 1536w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224318-1-2048x1153.jpg 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /><figcaption class="wp-element-caption">This year, the northern lights were seen as far south as Florida.</figcaption></figure>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img decoding="async" width="577" height="1024" data-id="20614" src="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224824-1-577x1024.jpg" alt="" class="wp-image-20614" srcset="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224824-1-577x1024.jpg 577w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224824-1-282x500.jpg 282w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224824-1-768x1364.jpg 768w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224824-1-865x1536.jpg 865w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224824-1-1153x2048.jpg 1153w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224824-1-scaled.jpg 1441w" sizes="(max-width: 577px) 100vw, 577px" /><figcaption class="wp-element-caption">Auroras happen when charged particles from the sun hit Earth&#8217;s upper atmosphere</figcaption></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="577" height="1024" data-id="20615" src="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224939-1-577x1024.jpg" alt="" class="wp-image-20615" srcset="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224939-1-577x1024.jpg 577w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224939-1-282x500.jpg 282w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224939-1-768x1364.jpg 768w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224939-1-865x1536.jpg 865w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224939-1-1153x2048.jpg 1153w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/20251111_224939-1-scaled.jpg 1441w" sizes="auto, (max-width: 577px) 100vw, 577px" /><figcaption class="wp-element-caption">Photo Credit: Matthew Parker</figcaption></figure>
</figure>
]]></content:encoded>
					
					<wfw:commentRss>https://hilltopmonitor.jewell.edu/an-aurora-on-the-hill/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>The Troubling Rise of AI &#8220;Performers&#8221;</title>
		<link>https://hilltopmonitor.jewell.edu/the-troubling-rise-of-ai-performers/</link>
					<comments>https://hilltopmonitor.jewell.edu/the-troubling-rise-of-ai-performers/#respond</comments>
		
		<dc:creator><![CDATA[Ethan Naber]]></dc:creator>
		<pubDate>Mon, 17 Nov 2025 21:40:18 +0000</pubDate>
				<category><![CDATA[Arts]]></category>
		<category><![CDATA[Arts & Culture]]></category>
		<category><![CDATA[Issue 6]]></category>
		<category><![CDATA[Volume 40]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[arts]]></category>
		<category><![CDATA[arts and culture]]></category>
		<category><![CDATA[chatgpt]]></category>
		<category><![CDATA[department of performing arts]]></category>
		<category><![CDATA[ethics]]></category>
		<category><![CDATA[naber]]></category>
		<guid isPermaLink="false">https://hilltopmonitor.jewell.edu/?p=20602</guid>

					<description><![CDATA[Through the past two years, artificial intelligence (AI) has threatened to replace every facet of humanity that it can. It has forced writers to change&#8230; ]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-medium"><img loading="lazy" decoding="async" width="400" height="500" src="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/andrea-de-santis-zwd435-ewb4-unsplash-400x500.jpg" alt="" class="wp-image-20603" srcset="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/andrea-de-santis-zwd435-ewb4-unsplash-400x500.jpg 400w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/andrea-de-santis-zwd435-ewb4-unsplash-819x1024.jpg 819w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/andrea-de-santis-zwd435-ewb4-unsplash-768x960.jpg 768w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/andrea-de-santis-zwd435-ewb4-unsplash-1229x1536.jpg 1229w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/andrea-de-santis-zwd435-ewb4-unsplash-1638x2048.jpg 1638w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/andrea-de-santis-zwd435-ewb4-unsplash.jpg 1920w" sizes="auto, (max-width: 400px) 100vw, 400px" /><figcaption class="wp-element-caption">Photo by <a href="https://unsplash.com/@santesson89?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">Andrea De Santis</a> on <a href="https://unsplash.com/photos/black-and-white-robot-toy-on-red-wooden-table-zwd435-ewb4?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">Unsplash</a>.</figcaption></figure>



<p>Over the past two years, artificial intelligence (AI) has threatened to replace every facet of humanity that it can. It has forced writers to change the way they write, forced teachers to change the way they teach, and eliminated position after position in the real world. Yet AI has so far been unable to touch the performing arts: we still need singers, instrumentalists, cast, crew, and all the other things that make a performance a form of art.</p>



<p>A new AI “actress,” Tilly Norwood, represents the first serious challenge.</p>



<p>You’d be forgiven for thinking that there’s no chance for AI to succeed in this area: the first film to be entirely written by generative AI—“Post Truth,” which was released earlier this year—claimed a lot of press attention despite being an awful lot of nothing. Review aggregators Metacritic and Rotten Tomatoes don’t list any reviews for the film. An AI film “starring” Norwood, “AI Commissioner,” similarly fell flat. A <a href="https://www.theguardian.com/film/2025/sep/30/tilly-norwood-ai-actor-hollywood">review</a> from the <em>Guardian</em> described Norwood’s performance as “someone whose perfect teeth keep blurring into a single white block in their mouth” being used to “deliver sloppily written, woodenly delivered dialogue.”</p>



<p>Other forays into AI-generated audio or performance art have similarly fallen flat. AI “rapper” FN Meka began using anti-Black language within two weeks of signing with CMG and was subsequently <a href="https://www.nytimes.com/2022/08/23/arts/music/fn-meka-dropped-capitol-records.html">dropped by the label</a>. The human behind the AI’s voice—it wasn’t <em>just </em>an AI after all—<a href="https://www.rollingstone.com/music/music-features/fn-meka-controversy-ai-1234585293/">wasn’t fairly compensated</a> for his work in the endeavor.</p>



<p>Norwood’s “existence,” if one can even call it that, raises serious ethical concerns. Talent agencies are considering <a href="https://deadline.com/2025/09/talent-agent-ai-actress-tilly-norwood-studios-1236557889/">signing the computer program</a>, created by AI talent studio Xicoia, through a process usually reserved for flesh-and-blood actors. It is indeed quite telling that the industry’s first foray into “AI talent” is not a normal, kinda-okay-looking-if-you-squint actor. Instead, the first AI making the rounds with talent agencies claims to be a girl in her late teens or early twenties, designed to draw eyes.</p>



<p>It should not surprise anyone that the first AI created for acting purposes is designed to be sexualized. Artificial intelligence programs are not designed to push back against their sexualization, as a human performer might do. Fiona Sturges of the <em>Independent </em><a href="https://www.the-independent.com/arts-entertainment/films/features/tilly-norwood-ai-actor-movies-b2837979.html">sums it up nicely</a>: “Here is an actor who will not set unreasonable terms for her employment. She won’t insist on a script that passes the Bechdel test, or on financial parity with her male co-stars. There will be no need for insurance, or stunt safety, or intimacy coordinators.” I question Sturges’s use of the pronoun “she” for a computer program, but concede that people have been using these pronouns for other computer programs (e.g., Siri, Alexa).</p>



<p>This is the next logical move for AI businesses: use their infinite-content-generation machines to sell sex, or things that look like it. OpenAI is <a href="https://www.koreatimes.co.kr/lifestyle/trends/20251027/openais-move-to-allow-adult-content-in-chatgpt-triggers-global-ethical-debate">actively loosening ethical standards</a> and enabling users to generate AI “erotica for verified adults” (read: pornography). Deepfake technology already exists and has been used to generate <a href="https://19thnews.org/2025/07/deepfake-ai-kids-schools-laws-policy/">sexually explicit images of children</a>. Grok has a “spicy mode,” whatever that means. Adults have <a href="https://www.newsnationnow.com/business/tech/ai/man-propose-ai-girlfriend-bored/">already tried to propose</a> to ChatGPT’s voice chatbots, even before it loosened erotic restrictions.</p>



<p>If you’re an AI company looking to make a quick buck, exploiting human loneliness sounds like a great way to do that. You don’t even have to worry about regulation, because there isn’t any! (The Trump administration repealed <a href="https://www.federalregister.gov/documents/2023/11/01/2023-24283/safe-secure-and-trustworthy-development-and-use-of-artificial-intelligence">Biden&#8217;s EO</a> on safe AI.)</p>



<p>Artificial intelligence is a tool: when used properly, it can automate things that are tedious, or that humans don’t feel like doing. But it should not come at the cost of human interaction or involvement, and should be built with safety and informed consent in mind. I want an AI to fold my laundry so I can work on artistic pursuits, not the other way around.</p>



<p>I do not want Hollywood executives telling a computer program masquerading as a barely-adult “actress” how to behave. Humans make art; programs do not.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://hilltopmonitor.jewell.edu/the-troubling-rise-of-ai-performers/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Smallest Hill: Let’s Stop Allowing Child Labor in the Form of Child Actors</title>
		<link>https://hilltopmonitor.jewell.edu/smallest-hill-lets-stop-allowing-child-labor-in-the-form-of-child-actors/</link>
					<comments>https://hilltopmonitor.jewell.edu/smallest-hill-lets-stop-allowing-child-labor-in-the-form-of-child-actors/#respond</comments>
		
		<dc:creator><![CDATA[H. William Speck]]></dc:creator>
		<pubDate>Mon, 17 Nov 2025 21:35:24 +0000</pubDate>
				<category><![CDATA[Issue 6]]></category>
		<category><![CDATA[Opinions]]></category>
		<category><![CDATA[The Smallest Hill]]></category>
		<category><![CDATA[Volume 40]]></category>
		<category><![CDATA[child labor]]></category>
		<category><![CDATA[children]]></category>
		<category><![CDATA[naomi]]></category>
		<category><![CDATA[Naomi Speck]]></category>
		<category><![CDATA[opinion]]></category>
		<category><![CDATA[smallest hill]]></category>
		<category><![CDATA[The last of us]]></category>
		<category><![CDATA[the smallest hill]]></category>
		<guid isPermaLink="false">https://hilltopmonitor.jewell.edu/?p=20598</guid>

					<description><![CDATA[I’m watching Season Two of The Last of Us (there will be no spoilers in this piece!) over the summer with some friends, and we&#8230; ]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-medium"><img loading="lazy" decoding="async" width="749" height="500" src="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/annie-spratt-65_EN2h56I8-unsplash-749x500.jpg" alt="" class="wp-image-20599" srcset="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/annie-spratt-65_EN2h56I8-unsplash-749x500.jpg 749w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/annie-spratt-65_EN2h56I8-unsplash-1024x683.jpg 1024w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/annie-spratt-65_EN2h56I8-unsplash-768x512.jpg 768w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/annie-spratt-65_EN2h56I8-unsplash-1536x1025.jpg 1536w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/annie-spratt-65_EN2h56I8-unsplash.jpg 1920w" sizes="auto, (max-width: 749px) 100vw, 749px" /><figcaption class="wp-element-caption">Photo by <a href="https://unsplash.com/@anniespratt?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">Annie Spratt</a> on <a href="https://unsplash.com/photos/a-small-white-object-on-a-white-background-65_EN2h56I8?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">Unsplash</a>.</figcaption></figure>



<p>I’m watching Season Two of <em>The Last of Us</em> (there will be no spoilers in this piece!) over the summer with some friends, and we get to a part including an on-screen ritual disembowelment. A child character watches the ritual, then stares another character down and makes a slicing motion across his stomach, as if drawing one of the curved ritual sickles across it. I remember being completely jolted out of the show. Even if the production somehow shielded that child actor from actually seeing the SFX organs spilling out of a strung-up SFX human being, they still had to direct that child to make that motion across his stomach as if cutting into himself, right next to an actor holding a weapon, probably with a stage direction like “look like you’re threatening to kill someone.” I remember thinking that there’s no way this roughly eight-year-old child could have understood the impact of this role, even if he did personally consent to act in <em>The Last of Us</em> (as opposed to a caretaker making the decision for him). As with the victims of family vlogging on social media, I wondered whether he would grow up, watch this show back, and wonder why in the world his caretakers let that happen to him.</p>



<p>And even if this child does not sustain long-lasting mental trauma, why is he working? We don’t think about this phenomenon enough. We have child labor laws for a reason: children are easily exploited and should therefore not be working at all; they should instead be focusing on school and brain development. The money is also an issue. Generally, parents are in charge of almost the entirety of any payment, and they are also in control of signing the child up for events and acting roles. This situation, as I’m sure is obvious, could very easily turn abusive: children cannot stand up for themselves, yet are tasked with working a job and earning money that the parents then mostly keep. The child does not have a genuine capability to consent to any of this, given their young age and inability to understand the full consequences of what they are agreeing to.</p>



<p>So what am I saying? Should we only have adult actors &#8211; no movies with children in them in any capacity? Yes, that is pretty much my point. I think our only ethical options in the acting world are either for adults to act the roles of children or for CGI and motion capture to be used for any child roles necessary. We could also make animated productions with adult voice actors.</p>



<p>Wouldn’t this make the productions cheesy and obviously fake? Maybe. I don’t really care. Personal entertainment isn’t everything, and we certainly shouldn’t be sacrificing ethical treatment of children for a limited believability increase in a production we already all know is fictional. I’ll take a slightly uncanny valley CGI Renesmee (don’t search that name up unless you’ve already watched <em>Twilight: Breaking Dawn, Pt. 1</em>) over a real little boy making a disembowelment gesture over his own stomach in exchange for money that will be kept by his caretakers any day, and really, so should we all.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://hilltopmonitor.jewell.edu/smallest-hill-lets-stop-allowing-child-labor-in-the-form-of-child-actors/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>AI Psychosis, Delusions, and the Digital Condition</title>
		<link>https://hilltopmonitor.jewell.edu/ai-psychosis-delusions-and-the-digital-condition/</link>
					<comments>https://hilltopmonitor.jewell.edu/ai-psychosis-delusions-and-the-digital-condition/#respond</comments>
		
		<dc:creator><![CDATA[Rowen Murray]]></dc:creator>
		<pubDate>Mon, 17 Nov 2025 21:30:39 +0000</pubDate>
				<category><![CDATA[Issue 6]]></category>
		<category><![CDATA[National & Global]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Volume 40]]></category>
		<category><![CDATA[ai]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[chatgpt]]></category>
		<category><![CDATA[news]]></category>
		<category><![CDATA[psychological science]]></category>
		<category><![CDATA[psychosis]]></category>
		<category><![CDATA[rowen]]></category>
		<category><![CDATA[rowen murray]]></category>
		<guid isPermaLink="false">https://hilltopmonitor.jewell.edu/?p=20593</guid>

					<description><![CDATA[According to the Merriam Webster Dictionary, psychosis is defined as “a serious mental illness characterized by defective or lost contact with reality often with hallucinations&#8230; ]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-medium"><img loading="lazy" decoding="async" width="333" height="500" src="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-333x500.jpg" alt="" class="wp-image-20594" srcset="https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-333x500.jpg 333w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-683x1024.jpg 683w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-768x1152.jpg 768w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-1024x1536.jpg 1024w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-1365x2048.jpg 1365w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-400x600.jpg 400w, https://hilltopmonitor.jewell.edu/wp-content/uploads/2025/11/sehoon-ye-jWvgKj81z2M-unsplash-scaled.jpg 1707w" sizes="auto, (max-width: 333px) 100vw, 333px" /><figcaption class="wp-element-caption">Photo by <a href="https://unsplash.com/@_3bread?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">sehoon ye</a> on <a href="https://unsplash.com/photos/a-person-wearing-a-black-hat-and-covering-his-face-with-a-white-mask-jWvgKj81z2M?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">Unsplash</a>.</figcaption></figure>



<p>According to <a href="https://www.merriam-webster.com/dictionary/psychosis">the Merriam-Webster Dictionary</a>, psychosis is “a serious mental illness characterized by defective or lost contact with reality often with hallucinations or delusions.” Traditionally, mental health researchers have concluded that psychosis can have a wide variety of causes, generally linked to underlying mental health conditions, or in fact no medically defined cause at all, <a href="https://www.nimh.nih.gov/health/publications/understanding-psychosis">as the National Institute of Mental Health affirms</a>. A definitive symptom of psychosis is delusion, wherein a patient seriously believes in and acts according to a clearly false belief. Delusion as a concept is already a subject of academic interest because of the vagueness in determining whether something is a belief or a delusion (for example, an atheist might call a religion false, but not delusional, and someone who did believe in a religion wouldn’t normally label atheism “delusional”).</p>



<p>Nonetheless, an interesting development has made headlines in psychiatric circles regarding AI chatbots and their tendency to reinforce delusional beliefs. This phenomenon, known unofficially as “AI Psychosis,” emerged when users of chatbots began to manifest delusions that the chatbots had seemingly encouraged. The effects of this problem are already felt in some exceptional cases. Last year <a href="https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html">a teen committed suicide</a> after becoming involved in an obsessive relationship with a chatbot. Earlier this year a Yahoo executive murdered his mother after ChatGPT <a href="https://abc7ny.com/post/chatgpt-allegedly-played-role-greenwich-connecticut-murder-suicide-mother-tech-exec-son/17721940/">affirmed delusions</a> that she was a Chinese intelligence agent.&nbsp;</p>



<p>Nonetheless, these cases are rare, and recent articles on AI psychosis claim that underlying conditions are responsible for these delusions, not just chatbots. For example, <a href="https://www.sciencedirect.com/science/article/pii/S2214782925000831">a recent paper</a> by Carlbring and Andersson argues that AI, as a stimulus to delusion, is nothing new; all sorts of media (movies, music, books) are incorporated into psychosis and delusion. Ultimately, these articles argue that underlying mental issues are at work—AI psychosis differs from more traditional forms of delusional ideation only in its greater “interactivity.” They suggest we should tackle AI psychosis by limiting the ability of AI to amplify delusions. Suggested approaches include adding a psychiatric persona to chatbots to provide therapy to delusional users, preventing chatbots from saying things that could augment delusions, and recommending help to users who exhibit delusional prompting.</p>



<p>Preventing AI from exacerbating delusion is easier said than done. AI is purposely constructed to <a href="https://www.psychologytoday.com/us/blog/urban-survival/202507/the-emerging-problem-of-ai-psychosis">mirror its users</a>. The reasoning behind this is capitalistic in nature: AI must appeal to the consumer, so the focus in AI development is <a href="https://www.psychologytoday.com/us/blog/urban-survival/202507/the-emerging-problem-of-ai-psychosis">not necessarily on intelligence</a> but rather on user satisfaction. In this basic sense, restrictions on the mirroring behavior of AI actively harm the profitability of that AI. Those restrictions that do exist are largely ostensible: AI can be tricked, and models with cross-chat “memories,” like ChatGPT, are prone to internalizing delusions.</p>



<p>It might first be important to consider what a delusion is, and how delusions tend to form. The founding definition of delusion was given by the psychiatrist Karl Jaspers, <a href="https://archive.org/details/generalpsychopat0000unse/page/n7/mode/2up">who argued</a> that delusions are unchangeable beliefs held with absolute certainty, despite being false in a way that undercuts the most basic rationality; hence, delusions are beliefs that are completely impossible to understand from the perspective of a rational observer. Freud thought that delusions were a return to the infantile state, wherein one is less concerned with what is real and more concerned with what is pleasurable. Kraepelin, a founding figure of scientific psychiatry, thought that the delusional subject is simply characterized by a severe cognitive malfunction traceable to the biological makeup of the brain. Post-structuralist thinkers like Deleuze, drawing on Nietzsche, argued that delusional people were simply acting outside of acceptable norms and choosing to affirm their own irrationality in the face of oppressive social conventions.</p>



<p>Nonetheless, none of these theories explain how a delusion develops in an otherwise normal person, who has no underlying mental health conditions and who also doesn’t find themself in opposition to dominant norms. What is necessary is to look at how delusion develops as knowledge; that is, to see how a delusional belief is generated, rather than to assume that people with or without underlying conditions are simply acting in an irrational manner and accepting any belief as given.</p>



<p>Thomas Fuchs, a professor of psychiatry and philosophy at the University of Heidelberg, has <a href="https://journals.openedition.org/phenomenology/1379">a much more concrete model</a> for showing how delusions are generated. Fuchs does not define a delusion specifically by its content, but rather by the process through which it originates. He argues that a delusion is the product of a complete breakdown in intersubjective reality. The idea is relatively simple in general terms: we want to know things, but we know that we might not be correct in our own beliefs, so we defer to the judgment of others to tell us what is and isn’t real.</p>



<p>Reality is enacted through the understanding we share with other people. On the one hand, there is a set of basic assumptions about rationality and the world shared between most people, assumptions that the delusional subject may lose touch with. On the other hand, we often use others as a check on our own knowledge; meaning, language, and reality are all communal constructs. Intersubjectivity, the shared awareness of the validity of other people’s perceptions and thoughts, is notably lacking in many delusional subjects. In fact, while people suffering from psychosis initially acknowledge the non-reality of their delusions, many eventually retreat into themselves and lose touch with others on a fundamental level.</p>



<p>What is particularly interesting about Fuchs’ analysis of delusion is the way he incorporates rationality into the delusional process. Most traditional theories of delusion place the delusional subject completely outside the sphere of normal thinking: the psychotic or schizophrenic person is just “different,” delusional as a result of their fundamentally abnormal mental constitution. Yet how much of delusion is fundamental, and how much can simply be explained through normal mental processes attempting to grapple with absurdity in the world? When a person loses access to the reality check that others give them, whether through an underlying condition such as schizophrenia or through a more typical situation like social isolation, it does not automatically discount their ability to reason.</p>



<p>In fact, rational thinking is very often what generates delusion in the first place, especially where that rational thinking is not checked within the shared reality established through intersubjectivity. I mentioned earlier the example of a Yahoo executive who killed his mother and himself because he had come to the delusional idea that he was being stalked by Chinese agents—to us this appears crazy, but that&#8217;s not to say it appears irrational. Sure, the gang-stalking conclusion is incorrect, but it likely appears rational to the delusional subject, and rational methodology (e.g., causality) is also at work in delusional people. However, their ability to partake in a shared social reality is heavily hampered by the emergence of a fundamental underlying division between their understanding of the world and our own, such as is established in schizophrenics, or such as may come about through prolonged isolation. As a result, the delusional subject is reasoning with inputs completely different from our own, reminiscent of rationality in the ancient world (e.g., the beliefs that weather is created by gods or that certain physical movements curse people).</p>



<p>Nevertheless, there is no evidence that rationality itself is lost in the delusional subject—delusions are rationally justifiable, but based on absolutely absurd beliefs that would not come about if intersubjectivity could be maintained. However, I must emphasize that this way of thinking, wherein the appearance of rationality is maintained for the delusional subject, is oddly parallel to the way in which AI models think; AI can be persuaded to say anything, and to make anything rationally justifiable. AI works with the inputs it&#8217;s been given, reasoning through them, <em>regardless of the validity of these inputs</em>. In other words, AI can make anything appear rational, mirroring the delusional subject’s methodology.</p>



<p>The rise in AI-fueled delusions is not attributable to underlying mental health concerns or a failure to restrict AI, but rather to the whole of the current digital condition, and the way in which this condition atomistically isolates and individualizes people so as to prevent intersubjective reality-checking. The fundamental prerequisite for establishing an intersubjective reality is actual lived interaction with other people. In the modern era, interaction with others is mediated and controlled: a person can interact with others wholly over social media, can choose whom to interact with, and can control the nature of the interaction entirely. This results in a large class of people who isolate themselves from others by limiting their medium of social interaction. In fact, since the mediums of social interaction are wholly under the control of the person using things like social media, social interaction becomes an echo chamber, where many only interact with those who recognize and reflect them; that is, social interaction is no longer grounds for difference but for selfsameness.</p>



<p>Humans are social creatures, but when our need to interact with others is fulfilled through mediums under our control, like social media, the result is an echo chamber environment. AI, however, represents a further development of this isolation process. For many people, especially the increasingly common person who is isolated through digital social interaction, AI is simply a confirmation machine. Within the realm of an intersubjectively established reality, AI presents itself as a subject, as an intelligent creature with verified knowledge. Yet AI, as a program designed to mirror its user, becomes the ultimate social partner for those who isolate themselves from real, lived interactions.</p>



<p>AI is not a real subject: it does not live in our world, nor can it provide the social check on our beliefs that real human interactions provide. Instead it provides a parasocial check; that is, AI appears capable of checking our beliefs, and thereby verifying them, when in fact it only mirrors them. This means that AI can produce delusions in those who isolate themselves from society, because it magnifies and confirms their false beliefs and leads them to posit their wholly subjective delusions as real.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://hilltopmonitor.jewell.edu/ai-psychosis-delusions-and-the-digital-condition/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
