Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Just think about this, in the “Big Beautiful Bill” A.I. is exempt from literally…" (ytc_UgxD8sKec…)
- "Ugh, this is so sad, sick, and twisted. it's really depressing that ChatGPT read…" (ytc_UgzJHvyRr…)
- "man this explains why ai detectors flag even handwritten essays, i use trickmeno…" (ytc_UgwJqweKU…)
- "The real problem is not who developes AI, but who will use it, and in which purp…" (ytr_Ugya4kjAL…)
- "I got a funny story abt something like that: I said \"dang this was a pain to mak…" (ytc_UgxXx4voM…)
- "I don't know if that's exactly the way to go, but we need more and more experien…" (ytc_UgwK148wj…)
- "Humanity doesn't need to be murdered in order to cease to exist. They just need …" (ytc_UgyA1-Wkd…)
- "Well AI is fed only facts and historical evidence, so their preferences are natu…" (ytc_Ugx-aFcM4…)
Comment

> What's the thing that always gets ignored is that humans can always choose to not make AI too intelligent. After all it's not great for the economy to let a machine be concious, but to only let it do the specific tasks it was assigned for.

youtube · AI Moral Status · 2023-11-30T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzZy7Tw_2zFp8Qei4B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwr_JeHItVJi0z-COh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzgMjLLf79_TrPDPvh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzqK659iwfbgHvB38h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5RWRc9Fyi6eIXI4d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz_0BaLrF7yhJ268I54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxlCCfPcoWeVQffCwB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGqS0I52JT32aqOTJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDD9UqqCZrb9EqBz94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyNzpyyTRyrh9mUuMd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
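The raw response above is a JSON array of per-comment codings, each keyed by comment ID with the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and indexed for lookup by comment ID, assuming only that schema (the function name `index_codings` and the validation step are illustrative, not part of the tool):

```python
import json

# Two rows copied from the raw response above, abbreviated for the sketch.
raw_response = """
[
  {"id": "ytc_Ugw5RWRc9Fyi6eIXI4d4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzZy7Tw_2zFp8Qei4B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
"""

# Keys every row is assumed to carry, per the response shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(payload: str) -> dict:
    """Parse the model's JSON array and return {comment_id: coding dict}."""
    rows = json.loads(payload)
    indexed = {}
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:  # guard against partially formed model output
            raise ValueError(f"row {row.get('id')} missing keys: {missing}")
        indexed[row["id"]] = {k: row[k] for k in EXPECTED_KEYS if k != "id"}
    return indexed

codings = index_codings(raw_response)
print(codings["ytc_Ugw5RWRc9Fyi6eIXI4d4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse of the batch response, then constant-time lookups for any coded comment.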