Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Giving machine gun in the hand of robot is like releasing Emraan Hashmi in bevy …" (ytc_UgwksUqEF…)
- "I simply don't respect AI art. Yeah, I guess the end product looks meh. But to m…" (ytc_UgwuaFiAP…)
- "I work at a middle school in california. this has been in an issue here as well.…" (ytc_UgwBZMP7P…)
- "The reason why the EU regulations don't deal with military or defense use of AI …" (ytc_UgydW-in4…)
- "I wouldn’t say I’m a real artist, but I have been improving on my art. And the f…" (ytc_UgxiLbmmS…)
- "Ugh looks like ChatGPT developed clinical anxiety. Does anyone know how to cod…" (rdc_jvn8ak8)
- "Another possibility is that the AI simply builds over us like how we build roads…" (ytc_UgxrY8I3o…)
- "1% in AI, 99% jobless, having no ability to spend, & this results in societal col…" (ytc_Ugw-AUeq6…)
Comment

> There's a reason why these models are called Large Language Models and not Large Reasoning Models or Large Thinking Models. These models are only trained to be good In languages, not reasoning or thinking. They make good salesbots, but they can't think to save their lives.

Source: youtube · AI Responsibility · 2023-06-11T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzF5EcgYJ9F4oTAF7l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXqjdeunxkSq02cLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz3DAITwKJDcAk-6GV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugwt80nqRCC8O4IIYMx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7int5xfY9YJyPtmp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzA3ubUQBC4fPmDMJ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxju3omli0ZJzfx2u94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyZyx6HuAQU2ekNNM14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz80jrcP-H2uDzvDEZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzXGWq9drkiPM0BRpx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
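The raw response is a JSON array of coded records, one per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID follows. The `DIMENSIONS` vocabulary here is an assumption inferred only from the values visible in this sample; the real codebook may define more categories.

```python
import json

# Assumed vocabulary per dimension, inferred from the sample output above;
# the actual codebook may allow additional values.
DIMENSIONS = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "unclear"},
}

# A one-record excerpt of the raw LLM response, for illustration.
raw_response = """[
  {"id": "ytc_UgzA3ubUQBC4fPmDMJ54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index coded records by comment ID,
    flagging any value that falls outside the expected vocabulary."""
    records = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                print(f"unexpected {dim}={rec.get(dim)!r} for {rec['id']}")
        records[rec["id"]] = rec
    return records

coded = index_by_id(raw_response)
print(coded["ytc_UgzA3ubUQBC4fPmDMJ54AaABAg"]["emotion"])  # prints "mixed"
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once parsed, every coded comment is a single dictionary access rather than a scan of the array.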