Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "You’re basically sending your kid to a homeschool but at school. Stupid. Just ho…" (ytc_UgzTbD1sr…)
- "no one is going to hook up some AI to control nuclear power plant nor some elect…" (ytc_UgySQV4vD…)
- "A person making a one to one copy is breaking the law, even if they succeed. But…" (ytr_UgxohpLzv…)
- "If it's shot for shot, then show the shot for shot / But knowing how AI works, I …" (ytc_Ugy9QjLcy…)
- "Too many people overlook the \"G\" in ChatGPT. You know, the word \"generative\". …" (ytc_Ugw0A-3F6…)
- "lets use ai to build a civilisation in the universe, and get energy of suns…" (ytc_UgwZManbD…)
- "I dotn care if people use AI what is wrong is not stating its AI.…" (ytc_UgyA3z-xw…)
- "I find it faintly unnerving that at the end of the day, their controls for how i…" (rdc_lb4cy6z)
Comment

> We’ve learned nothing from the cautionary tales of War games, Terminator movies, 2001 A.I. artificial intelligence, I am robot. Someone thought up the concepts and considered the consequences and told the story. And yet here we are. When they started this technology they made the decision to not think about what could happen because of ambition selfishness and pride. Look at what we can do, who cares what may happen, we’ll cross that bridge when something happens. Smart people are stupid.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-07-25T14:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCQpTtIyhmspKxf2V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzE30HAQluEyeGI76N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSKdDu-d0ICscxxa14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJEgqRWMfXhDW3_v94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugya8v2yMMt97FfcZ6B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgySE7v1HeRT1kHCJJV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz5u-yAazoHemX9Ob94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzbL2un8iU8RthS7WF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwhT4v_y3p7DhfdMJ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyJYsIe1ixFEq2L9HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
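A raw response like the one above can be turned into a "look up by comment ID" table with a few lines of Python. This is a minimal sketch, not the tool's actual implementation: the `index_by_id` helper and `DIMENSIONS` tuple are hypothetical names, and the response string is truncated to two of the entries shown above for brevity. The four coding dimensions match those in the Coding Result table.

```python
import json

# Two entries from the raw LLM response above (hypothetical truncation for brevity).
raw_response = '''
[
  {"id": "ytc_UgxCQpTtIyhmspKxf2V4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzJEgqRWMfXhDW3_v94AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "fear"}
]
'''

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model output and build a comment-ID -> coding mapping."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Skip malformed entries that lack an ID or any coding dimension.
        if "id" not in rec or not all(d in rec for d in DIMENSIONS):
            continue
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codings = index_by_id(raw_response)
print(codings["ytc_UgzJEgqRWMfXhDW3_v94AaABAg"]["emotion"])  # fear
```

Indexing by ID also makes it easy to spot comments the model skipped or coded twice, by comparing the dictionary's keys against the batch of IDs that was sent.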