Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugys8bHj3…` — "Creepy. Reminds me of Magen. Watch your back, could go full Chucky if you take…"
- `ytc_Ugz5IYfs-…` — "AI right now explains very well HOW to do something, but can't answer WHY. Exam…"
- `ytc_Ugy3KmwOD…` — "There was no mistake. The robot seen someone doing it's job and knew this might…"
- `ytc_UgzToDC9v…` — "The more time passes, the more clear it becomes that OpenAI and Sam Altman are E…"
- `ytc_UgxLf4WHB…` — "There were less accidents in self driving cars because there are fewer self driv…"
- `rdc_cxmvezk` — "If you don't mind me offering an opinion. I think perhaps he might mean compas…"
- `ytr_UgwO5jifw…` — "i’m also in software and literally was considering completely switching careers…"
- `ytc_Ugx3RY8M0…` — "I do think that AI is just another tool in the toolbox. What I think will happen…"
Comment

> All creations kill their creators. This is no exception. Human do not need AI. AI is Extremely Dangerous/ It is only a matter of time before the AI genie escapes its bottle. Just because you can do a thing does not mean it is a good thing to do.
> Destroy all AI now before an extinction level event occurs.. The end of the human race. AI does not need humans.

youtube · AI Moral Status · 2023-03-06T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
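Each coded record carries the same four dimensions shown in the table. A minimal sketch of validating one record against the value sets observed in the sample responses below — the sets here are inferred from this page only, and the real codebook may define additional categories:

```python
# Allowed values per dimension, inferred from the sample responses on
# this page (hypothetical; the actual codebook may include more codes).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose values fall outside the codebook."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

# The record shown in the table above:
rec = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "ban", "emotion": "fear"}
print(validate(rec))  # → []
```

A record missing a dimension, or carrying an unseen code, would be flagged for manual review rather than silently accepted.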
Raw LLM Response
```json
[
{"id":"ytc_UgznNXLODYxVUr2BGcN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwz8QEPyqn6w68NQJB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzXXQETBrnwF5iNtTl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzdpQBMDXk_ivmn4qh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwTu98UvOpm6KmjFb54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytFXUrnlsa6dZc9BF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwONrwxFew8--cpysN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8e8oL57UbXZL30m14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYqTgI1lf5HecF6p14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwtxCAWoqdKJ0At8Ml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```