Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI is fine. Integration with intelligence and tech is what the greys warned abou…
ytc_UgzBrwWAp…
Nowadays every company has started calling themselves "AI first", but soon they …
ytc_UgzZy8A3p…
There's no moral question as to whether sapient robots deserve rights, obviously…
ytc_UgjKd8TV7…
Now the question is even if we try to unplug it will it truly kill the ai or has…
ytc_Ugy3PbQHu…
It takes more artistic talent to finish a colouring book for children than it do…
ytc_UgwB1EewN…
Yes. I recently left a company that told their engineering team they no longer w…
rdc_oi171ft
A deepfake, in itself, is not abuse or harassment. If a person is creating deepf…
ytc_UgxiZIxX6…
Banning the use of it in court would have a decent effect but not hugely so. Ex…
rdc_eu6kr5c
Comment
Why can’t AI researchers come up with better examples? Guys, you had years to think up examples. The best you can do is some nonsense about playing games 😂. Google has spent billions: it’s best use case in it’s TV ads is- “we’ll find you a restaurant that allows dogs” 😂😂😂
youtube
AI Responsibility
2025-11-29T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxOdKrPK3IhdY9uhJ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwW7sIOAPdW-625tx14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymtGYNWTHCI7IaH5N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwkT4GNSM9ZFdDLP0R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwsVAXX5EmQFkIK5Bp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTrCPY5dylt47IOhB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy5qwG0KlsjuTPPjd54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjCZe_1pVCqSRUx7N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxvmqKzJO-55jzFPit4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxtZ0xNhljBSgQTVF94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
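The raw response is a flat JSON array with one object per comment, keyed by the platform comment ID (`ytc_…` for YouTube, `rdc_…` for Reddit). A minimal sketch of the "look up by comment ID" step described above, assuming only the fields shown in the response (the variable names and the two sample rows are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment codes. Field names
# match the response shown above; the rows here are copied from it.
raw_response = """
[
  {"id": "ytc_UgxOdKrPK3IhdY9uhJ14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzjCZe_1pVCqSRUx7N4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the codes by comment ID so a single comment's coding
# (the four dimensions in the table above) can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgzjCZe_1pVCqSRUx7N4AaABAg"]
print(code["responsibility"], code["policy"])  # company regulate
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when the same response is inspected repeatedly in the UI.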