Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwnONBlW…`: 0:30 Yeah, f*ck asmogold or whatever that guy's name is. I can't wait for soulle…
- `ytc_UgwrPwPvj…`: If big tech is training AI on data from comments like these...no wonder why AI i…
- `ytc_UgzLJ0SjW…`: I thought robots don’t harm humans but we better get ready because robots should…
- `ytc_UgyZS_yYh…`: Will AI replace humans in entertainment? No. The best you'll get is uninterest…
- `ytc_UgwyrfnIz…`: I feel our youth are the most susceptible! I’m 53 and I don’t find interest in A…
- `ytc_UgxN_vF4z…`: Just break your problem down to very simple chunks then the ai will give you goo…
- `ytc_UgzPnhHaM…`: The way to fix the AI issue is AI files should have their own file format that c…
- `ytr_UgzKxcpsw…`: even if ai is "just an algorythm" like you seem to believe... ask yourself what …
Comment

> I remember listening to a story here on you tube about ai taking over..... Humans gave ai the role of protecting the earth... and that ended up destroying a majority of humanity until there was hardly anyone left and humanity was no longer considered a threat.

youtube · AI Governance · 2026-03-13T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxdx5rV8DGQFlmGbx54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzSiOl6goEqLM_2gkt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwUorKqXDxRnkSMQmd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOPNkphegAH9jzjwt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYAXhzMfmYdO5FmZF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyiiwlthXyUcwkWjst4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqHnO8Ei_InhDUoMB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxTJ4jSebFt20BJgAp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymYX2rYfvOblR_Tkl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9RYZg_lpa19SbfCJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
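The lookup-by-comment-ID described above can be sketched in a few lines: the raw LLM response is a JSON array of coded records, so indexing it by each record's `id` gives constant-time inspection of any coded comment. This is a minimal sketch, not the tool's actual implementation; the helper name is hypothetical and the two-record payload below is an illustrative excerpt with the same field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) as the response shown above.

```python
import json

# Illustrative raw batch response: a JSON array of coded comments,
# shaped like the "Raw LLM Response" payload above.
raw_response = """
[
  {"id": "ytc_Ugxdx5rV8DGQFlmGbx54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzSiOl6goEqLM_2gkt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_by_comment_id(payload: str) -> dict:
    """Parse a raw batch response and key each coded record by comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
hit = codings["ytc_UgzSiOl6goEqLM_2gkt4AaABAg"]
print(hit["responsibility"], hit["emotion"])  # ai_itself fear
```

Keying on the ID rather than scanning the list each time keeps repeated lookups cheap when a batch contains many coded comments.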