Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- AI is learning from humans and soon humans will learn from AI. The question is c… (ytc_Ugw1dMXCg…)
- @immanismjr5606 If i drew a picture of Homer Simpson have i stolen anything, or … (ytr_Ugz8hOkor…)
- I love the perminent look in your eye of "the hell did I just hear people are do… (ytc_UgwWTm5Ie…)
- When they are released with the latest AI, they will also say : Sorry I only dat… (ytc_Ugz6Cwi0d…)
- Kids should not be turned into robots, privacy of each individual should also be… (ytc_Ugy2UFhdz…)
- I'd never hear the end of it from my sisters cause I rip on them for their alpha… (ytc_UgyAAMfB7…)
- What if an artist trains a new AI solely on their own artwork and then uses that… (ytc_UgzkBXAEe…)
- I kind of wouldn’t be surprised if some of the response was also AI but they’re … (ytc_Ugwgf_EmK…)
Comment
AI won't need to do anything nefarious to take over anything. Human greed and laziness is so reliable that AI soon will know their creators are a pushover.. Humans will willingly hand over everything for an easy life or an advantage over other humans perfectly willingly.
youtube · AI Moral Status · 2025-07-01T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
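A coding result like the table above is just one record from the model's JSON output, presented per dimension. As a minimal sketch (the function name and the `coded_at` parameter are illustrative, not part of this tool's actual code), a record could be rendered into that markdown table like so:

```python
# Hypothetical helper: turn one coded record (a dict shaped like the entries
# in the raw LLM response on this page) into the per-dimension markdown table.
def render_coding_result(rec: dict, coded_at: str) -> str:
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)
```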
Raw LLM Response
[
{"id":"ytc_Ugyg10bTuFC7osW6pwZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymieBgfWROVpC0DGt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwzsYsTVUzz0D9sjTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzNQxJJWTYJTy8pEdB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxt5Bt6sHT68gAES0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxDTP0DDmkIv7K9MHp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxfDnPM8xmGvcg1aaR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxONx_4BOXD2bM9wfN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxioKzRzykGqjRa0Ox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXs-5V1905ubVJ_Qt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
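Since the model emits a JSON array of records, downstream code typically parses and validates it before storing the codes. The sketch below is one way to do that; the controlled vocabularies are inferred only from the values visible on this page, so the real schema may allow additional categories.

```python
import json

# Allowed values per dimension, inferred from the coding results shown on
# this page (assumption: the real codebook may define more categories).
VOCAB = {
    "responsibility": {"user", "developer", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the vocabularies."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records
```

A malformed record (say, an unknown emotion label) raises a `ValueError` naming the offending comment ID and dimension, which makes bad batches easy to spot in logs.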