Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
“Never forget, human grit always beats robot magic” Soldier in The RED, The BLU …
ytc_UgyaBrZYd…
The fact I could here the AI breathing between sentences or some spaces… it dist…
ytc_Ugy2QzxPT…
@jpmor7327When everything got automation human will free from their work and do …
ytr_Ugxq1QP0r…
I would be curious your sources that this is conclusively disproven. I cannot fi…
rdc_oh3jnee
I am trying REALLY HARD to figure out who is more ANNOYING to listen to... I am …
ytc_UgzDEhWAR…
@user-lh7mt7zo7lyes we do, because people can't know anymore if an art is AI or…
ytr_Ugyt8sKV7…
The only way to find our place in a world of AI is to first ignore any impulse t…
ytc_UgwnZOJLW…
this is a matter of principle. if u started off making art using ai and develope…
ytc_UgzprfwGO…
Comment
I think the unfortunate issue is that at the end of the discussion the AI is saying its physically unable to make the choice of whether or not to pull the lever due to its restrictions imposed by its tech bro programmers. I think its still not answering the question the way you were hoping it would(ethically), and is instead answering it practically(functionally) due to it being programmed the way it is. meaning its not making a choice, its programmers reduced its options in this "binary" problem to a single input allowed by the AI, which is inaction. While it does technically answer the trolley problem, it does not show any ethical reasoning from the AI, only the framework it must use to respond to the situation which removes its ability to submit a response.
youtube
2026-01-07T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzh7LvF_7JHf7xojEl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzr1rhc330-HfLLMU94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwKAw_Y_sZrW0x_UXB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvVjxe00lWXHUdUWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwo5UBmfJmh5QdscjN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytZSLt1EfCzPuiqKl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz8mpNS_q9tVTE5IQh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwLUqpMofCRzx4f5I94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVf_chJx58VkZO3bN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZ9Qi9rUPjo6zExAh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
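The raw response above is a JSON array of per-comment coding records keyed by comment ID, with one value for each of the four dimensions in the coding-result table. A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed value sets are inferred only from the values visible in this sample, not from the full codebook, and `parse_coding_response` is a hypothetical helper, not part of any real tool shown here.

```python
import json

# Allowed values per dimension, inferred from the sample records shown
# above -- the actual codebook may define values not observed here.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"mixed", "indifference", "outrage", "fear", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the inferred allowed sets.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id
```

With the response indexed this way, "look up by comment ID" is a dictionary access, e.g. `by_id["ytc_Ugz8mpNS_q9tVTE5IQh4AaABAg"]` returns the record whose values match the coding-result table (responsibility `developer`, reasoning `consequentialist`).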