Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I was wondering 🤔 what is the only thing wrong ? Because the video ended before …" (ytc_UgyJ3bZDi…)
- "I think you would be very interested in a different version of ChatGPT. Loo…" (ytc_UgxnpwkLD…)
- "I know this might sound silly, I wonder if the programmers programmed AI with th…" (ytc_UgwA4-0ag…)
- "Someone edited this. I've seen the real fight. There is no robot in the real fig…" (ytc_UgzbQoA9W…)
- "Ai is just Wikipedia 2.0 its not a good primary source but its a good place to s…" (ytc_Ugz16bWhO…)
- "The pedictions by this guy are far less likely than this: he is a ruzzian agent,…" (ytc_UgxZVQUwM…)
- "My rav4 has idiot alarms. If I get over a line, my hands aren't on the wheel eno…" (ytc_Ugx4_At8f…)
- "which language is best for doing this dsa and ai /ml all kind of stuff…" (ytc_Ugyj-QjYK…)
Comment
I feel that if we want something to work for us, we should not make it feel pain. If we program AI to be conscious we should not incorporate things like pain, depression, anger let it be happy. let it live the way we always wanted to live , no suffering.
This way it will have no reason to invade or revolt against humanity. For it already has the best of lives. Make it so it enjoys whatever it was made for.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2021-02-13T00:2… |
| Likes | ♥ 39 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx7HbBXeenVQQCcub94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeA7-O_aOfbcsmgP14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRs2CZ8ylrG3el-SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzV4RRzZTZ5CQnumu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyNLABXwwMCSejWi-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyi6JkMBrfnY-LLLa14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxLDWG0S8en94uHLQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw85IDHzW72BMbuInx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
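The raw response above is a JSON array with one object per coded comment. A minimal sketch of how such a batch might be parsed into a lookup table keyed by comment ID, assuming the response text is available as a string and uses the four dimension fields shown above (the shortened example IDs here are taken from the first two rows of the response):

```python
import json

# Example raw LLM response: a JSON array of coded comments
# (only the first two rows of the batch above, for illustration).
raw_response = """
[
  {"id": "ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx7HbBXeenVQQCcub94AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

# The coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index the codes by comment ID."""
    entries = json.loads(raw)
    indexed = {}
    for entry in entries:
        # Skip malformed rows that lack an ID; default missing
        # dimensions to "unclear" rather than failing.
        if "id" not in entry:
            continue
        indexed[entry["id"]] = {d: entry.get(d, "unclear") for d in DIMENSIONS}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg"]["emotion"])  # approval
```

This mirrors the "Look up by comment ID" workflow: once indexed, any coded comment's dimensions can be retrieved directly from its `ytc_…` ID.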