Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "We understand your concern about robots resembling humans and potentially affect…" (ytr_UgyOadnL7…)
- "I disagree with his comment regarding the invention of fire and the invention of…" (ytc_UgzyngvoQ…)
- "Is it chat gpt's job to give people psychological adivce? Doesn't seem like it t…" (ytc_UgyudHZso…)
- "I don't blame this robot for wanting to destroy humanity. Humanity is a destruct…" (ytc_UgyyNDZQN…)
- "Part of the awkwardness is that there Sam has to interpret the questions because…" (ytc_UgzS8Y0rl…)
- "XD the only one line written in to law of AI, thou shall not harm human. Right..…" (ytc_UgwBwrP1d…)
- "These agents have been released to the general public but there is NO WAY a hand…" (ytc_Ugydbwbvx…)
- "Driverless ok BUT A HUMAN MUST BE IN THE CAB AWAKE ! This must be a LAW. otherwi…" (ytc_UgzCMSYjf…)
Comment
Every time ChatGPT started with "i understand your concern Alex" it ended with "if you have any other questions or need further clarification, I'm here to help," almost as if it was trying to change subjects.
Source: youtube · Video: AI Moral Status · Posted: 2024-12-05T15:0… · ♥ 2832
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFUEbtY2xVEHu54Qd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyfhi3HiuyKvowKHzF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwanvuQ6kUNTF6zL_t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyKsS-sOKfUs0YPnax4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzlHsZjJoilqbF6cGZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyV3Yzou7nmq8-S2OB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxvOptqiw6UiPiRF8F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwbzaUYZoHo5S97wKp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzh6ocsK8sOmiFcPZJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz6Cvn7Wsgf1WiYWSl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
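The raw response above is a JSON array of per-comment codes across the four dimensions shown in the result table. A minimal sketch of how such a batch might be parsed and validated is below; the allowed values are inferred only from the visible samples (the real codebook may include more categories), and the function name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the samples shown above
# (assumption: the actual codebook may define additional categories).
SCHEMA = {
    "responsibility": {"ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference",
                "outrage", "resignation", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID. Rows without an ID are dropped;
    any value outside the schema falls back to "unclear"."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        coded[cid] = {
            dim: row.get(dim) if row.get(dim) in allowed else "unclear"
            for dim, allowed in SCHEMA.items()
        }
    return coded
```

For example, `parse_batch('[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]')` yields one entry under `"ytc_x"`, and an out-of-schema value such as `"emotion":"joy"` would be coerced to `"unclear"`.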