# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples

- @neondagger Logic, in itself, is nothing more than a subjectively valuable tool… (`ytr_UgynjEvNK…`)
- How can we be sure you're not just telling us what you want us to believe?… (`ytc_Ugwp2XwTJ…`)
- I use AI art as a reference material, then I would draw myself an original piece… (`ytc_UgxFYlubL…`)
- I have a strong hunch that not all important decisions in life provide a rationa… (`ytc_UgxFlgGKL…`)
- Ai will not ever become human. It may take on traits of human behavior but it wi… (`ytr_Ugw1yojK2…`)
- "Why do we need a driverless truck?" Take the driver out of the truck? Eliminate… (`ytc_UgxMyO-tg…`)
- Well unlike me, the robot in the wery beginning has a purpose in life even if i… (`ytc_UgiuuZL8z…`)
- Conciousness doesn't include having feelings, robots would be generally smarter … (`ytc_UghqesxgJ…`)
## Comment

> Typical Trump administration wanting to do something that is dumb as hell!
> Sure, go ahead. Make a super powerful AI and not understand how it actually works.
> I'm pretty sure this is what happened right before the AI went after John Connor

youtube · AI Moral Status · 2025-06-07T10:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
```json
[
{"id":"ytc_Ugw7tI2LYkYkpxpxSjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwELeObGbQmcYJ4eL14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsVdlYCbV4AC0io1F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugx3DWUNau0yRs68pHd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZjabhSg4jkrRg-iR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyaYxIr8zls3nunHqV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzlH2dJPg2ZJrGDQNp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwlO2VoLnRulRQlBst4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwURvkICBVZio1hThd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUv9VCWS9WhW19FkF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
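The raw response is a JSON array with one object per comment: an `id` plus the four coding dimensions shown in the result table. A minimal parsing-and-validation sketch is below; note that the allowed value sets are only those *observed in this sample*, not necessarily the full code book, and the function name is illustrative rather than part of the actual pipeline.

```python
import json

# Label sets observed in the sample response above; the real code book
# may define additional labels (assumption).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "industry_self", "ban"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify that every record
    carries a comment id and uses only known labels."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        if "id" not in rec:
            raise ValueError(f"record {i} is missing a comment id")
        for dim, allowed in OBSERVED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"record {i}: unexpected {dim}={value!r}")
    return records
```

Records that fail this check can then be flagged for re-coding instead of being written to the results store with an unrecognized label.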