Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is wrong. And also smart people are completely missing common sense. Maybe …" (ytc_UgzIjGDmQ…)
- "ChatGPT can’t even give me more than just fragmented sentences lately. Or help w…" (rdc_mdj4ct7)
- "How's replacing humans with AI working out for Klarna? Looks like they're regret…" (ytc_UgybA_5QN…)
- "I believe we are meant to be like Jesus in our hearts and not in our flesh. But …" (ytc_UgzDWOKdt…)
- "Scenario 2 will only work for a short period of time before AI replaces people t…" (ytc_UgxXqzWpl…)
- "That's. Omg. I literally found this out last night with my local ai. We where ha…" (ytc_UgwXYELgu…)
- "I tried my sketch on one of these AI shit and pooof! I don't even recognize my o…" (ytc_UgwcQ3lz_…)
- "As the creator, man had the chance to NOT create something that will exceed him,…" (ytc_Ugzw90_lk…)
Comment
Can technology make wars safer... how about can we find a way to resolve conflict without war? Politicians are horrible people because they see dollar signs when it comes to war, not the lives they send out to die. If they had any remorse for loss of human life they would attempt a peaceful resolve. And no, it wouldn't save lives. What do you think the ai is going to target. Innocent Lives!
Platform: youtube · Posted: 2021-09-08T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
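
Each coded dimension takes a value from a small closed set. Below is a minimal validation sketch for one coded record; the allowed value sets are inferred only from the codes visible on this page and may be incomplete relative to the full codebook:

```python
# Hypothetical validator for a single coded record. The value sets below are
# inferred from the examples shown on this page; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"government", "developer", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```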
Raw LLM Response
[
{"id":"ytc_UgxDV9VNkn65y0ssWv54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx25v-LnJWdWqSiXGJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx5Vo1f_6_5OpwuIJd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwfHhe4E3x8ZBhCqD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxwR5q8-kKOqo91wv94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpfFSdiir6nernkmh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwnDfgU20L18ewE4BR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgykKa-n4CRlX3ydXm94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyMEMUwOeFBF7re-hN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4-mjNya6jTm6lP3d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
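
The raw response is a JSON array with one object per comment in the batch; the record for the comment shown above is the third element (id ytc_Ugx5Vo1f_6_5OpwuIJd4AaABAg), whose codes match the Coding Result table. A minimal sketch of the lookup-by-ID step, assuming the model returned a clean JSON array (real responses may need fence-stripping or retry logic, omitted here):

```python
import json

# A one-element sample of the batch response above, used to keep this runnable.
raw = '''[{"id":"ytc_Ugx5Vo1f_6_5OpwuIJd4AaABAg","responsibility":"government",
"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'''

def lookup_codes(raw_response: str, comment_id: str) -> dict | None:
    """Parse a batch response and return the coded record for one comment ID."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}  # index the batch by comment ID
    return by_id.get(comment_id)

codes = lookup_codes(raw, "ytc_Ugx5Vo1f_6_5OpwuIJd4AaABAg")
# -> {"id": "ytc_Ugx5Vo1f_…", "responsibility": "government",
#     "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
```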