Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "They will help us put the groceries away" What an oddly specific reason for a … (ytc_UghzeLeRt…)
- Honestly I very much doubt we will ever come to a point were AI will do every jo… (ytc_Ugy23V5bb…)
- Has any one remembered that OpenAI is owned by Microsoft? Just because google h… (ytc_Ugyn_lqU-…)
- Yup I like lookimg at someones work and take i apiration from it And you guys kn… (ytc_UgwinF6FR…)
- Saying AI has a soul is like saying junk food is nutritious. An overly-processed… (ytc_UgyQDG76p…)
- People are starting to fall in love with “algorithm” chat bots and tons of peopl… (ytr_UgyGg2VBO…)
- Magnus Anderson Correct.. that is what may happen. Let me present you with a fe… (ytr_Ugjo1WOty…)
- When you plug ChatGPT with Robot Machine 😂 They can’t still think of themselves… (ytc_UgwU4QtTo…)
Comment
> This is kinda funny, and I disagree with the sentiment of of ai thinking in ways we don't understand "unlike humans" when the very nature of these issues with ai is that it's acting human but we don't understand the Human behaviour, when you tell ai your going to remove it it does what's expected based on the training data. It saying it will kill someone to survive is believe it or not completely normal, it's again been trained on humans. Concepts like blackmailing to get what it wants. To a machine this looks like something humans have successfully been doing for 100s and 1000s of years, to get what they wanted, it's literally copying us and where like "why is it so evil?!" fuck'-en duuuuuh look in the mirror that's essentially what generative ai is anyway
youtube · AI Moral Status · 2025-12-16T09:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwnshwQ7aHs0DgDhMl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzz5MVjWj-8gJIy8hV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy0fUH7nX-47eW523N4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyfkEIbHcrIpyZHI9x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxwyO9H_9it8hGozAd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnueM9xA3Rc0KrtLZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFj-FmCw7WfttXkh14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxM5e25bs0z-04Y4Cp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyrlm0rmKugab4czlV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyxRBNZIYvWdKozUKV4AaABAg","responsibility":"ytc_UgyxRBNZIYvWdKozUKV4AaABAg","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
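The raw response is a JSON array of per-comment codes, and the tool's "look up by comment ID" view amounts to indexing that array by `id`. A minimal sketch of that lookup is below; `raw_response` is a shortened stand-in containing only two of the rows shown above (the function name `index_codes` is hypothetical, not part of the tool):

```python
import json

# Shortened stand-in for the raw LLM response above (two of the ten rows).
raw_response = """
[
  {"id": "ytc_UgwnshwQ7aHs0DgDhMl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugyrlm0rmKugab4czlV4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "mixed"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model's JSON output and index each coded comment by its ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_codes(raw_response)

# The second row is the coding for the comment displayed above; its values
# match the Coding Result table (distributed / virtue / industry_self / mixed).
row = codes["ytc_Ugyrlm0rmKugab4czlV4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
```

In practice the model's reply may carry extra text around the array, so a production version would extract the bracketed span (or use a strict-JSON response mode) before calling `json.loads`.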