Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID
Random samples — click to inspect
Let's been honest, I train with AI courses now. I've met a lot of homeschooled e…
ytc_Ugzo1I0L3…
I believe everything has a conscious. A pillow has one, but since it has no exte…
ytc_UgjPKZV0G…
Having been born in the 90s, let me tell ya, this has been one heck of a wild ri…
ytc_Ugzp2LHRA…
I laughed at that statement too. It's almost 100 years since humans first used n…
ytr_UgxaWRVCO…
That's a great point! The name Sophia indeed has significant roots in various cu…
ytr_UgyafGFb9…
I have usted full self driving in many cities in Mexico and it works amazingly w…
ytc_UgyESfp9T…
I'm a translator and my boss also warned me about this. big F for AI…
ytc_UgyXWePep…
@SzalonyKucharz Of course it can't mean at the neuron level without being physic…
ytr_UgxsQhmJh…
Comment
4:20 - exactly! AI (mostly these days LLMs) are just pattern matching (ie, following an algorithm). Complex - yes, but still pattern matching. If an AI goes about their day, buying a coffee then randomly throwing it in another AI’s face - thereby showing (albeit weird) free will, I’d start to worry. Until then “it” is just choosing their next action based on what is appropriate next. Humans have free will and can/do choose their actions accordingly.
That’s not to say AI isn’t dangerous or difficult to understand by us, just not conscious or “living”.
youtube
AI Moral Status
2026-03-07T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugz1w6lsVfn9ZWAFJd94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugydi4vWsxS-pRizWdB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxd_VMFm1G9peONPnl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwErJkEXcMleRpCdRd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzD1YbyeNuJFCDpP1x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwLLOY_Ap4cnTImHwN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlZxHStz7M9b0uh3N4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyiFUjtTrpruQbaxrV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgygdhKUDPjWnT3KhbJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwaFf1QhdKSo_Awnax4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
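The raw response above is a JSON array with one object per comment, keyed by the four coding dimensions shown in the table. A minimal sketch of parsing and validating such a response — the allowed value sets below are inferred only from the entries visible on this page, and the actual codebook may include additional categories:

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept if it has an "id" and every dimension holds a known value;
    anything else (missing keys, out-of-vocabulary codes) is dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Filtering rather than raising keeps one malformed row from discarding an entire batch of codes, which matters when responses cover ten comments at a time as above.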