Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It annoys me because AI should simply be a tool. You know, maybe have it proofre…" (ytc_UgyH7wF8O…)
- "I have one confusion , if there is no electricity can even that time robot work …" (ytc_UgzdYt53s…)
- "its a tool only if the ai "art" is used for inspiration or something like that…" (ytr_Ugwpn5vnl…)
- "In San Francisco autonomous vehicles are already quite present on the roadways. …" (ytc_Ugz1XAfFy…)
- "I live in northern Maryland. PSEG wants to build new transmission lines from VA…" (ytc_UgzEojSsm…)
- "For Driverless Cars, a Moral Dilemma: Who Lives or Dies? the drivers who get shi…" (ytc_UgiDRHNP6…)
- "@goahnary Deep learning only learns from mistakes, so it would need constant cr…" (ytr_Ugxj1ZAew…)
- "If AI continues to be developed without restraints, the only place people will b…" (ytc_Ugz1HKWG2…)
Comment
Like I say USA Land of Opportunities Managed to Allow Scientists of Experiments in Creating the Atomic Nuclear Bombs Programs Dept of Defense that Spanned the Globe for Other Nations to Use to Threaten Humanities Complete Annihilation For the Most Part we Can Control this Yet AI Robots In Recognitions They can Recreate Themselves without The Humanities Is Astounding I say Now those who Paid into it's Possibilities Have Been given A Choice Just because we Know we can Use these Technologies to Control The Narrative Doesn't Mean we Should Use them to Destroy Us
Source: youtube · Topic: AI Responsibility · Posted: 2025-07-28T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxB-9kHQwbb7fHG2tx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxUxNG7cTPQbe9QnS14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzVAOcuRgaFEGIiD3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx7qXS-QdBP6CbnljR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwdqz4enlNC3-_l9NV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"unclear"},
  {"id":"ytc_UgwcDNTd8ARakHq2EqN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyl796yAI3y8VaX1N54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyMCHd1mFe_BB4YCl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxhpL5Qu7cHfLBURaJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxyrii6Rx67PJQ8dG14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
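A raw response like the one above can be parsed and indexed to support the by-ID lookup. Below is a minimal sketch, assuming the dimension vocabulary is exactly the set of values seen in these samples (an assumption: the real codebook may define more categories) and that every record carries all four dimensions plus an `id`:

```python
import json

# Allowed values per dimension, inferred from the samples shown on this page.
# Assumption: the actual codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"government", "developer", "company", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index its records by comment ID.

    Raises ValueError when a record is missing a dimension or uses a
    value outside the expected set, so malformed model output is caught
    before it enters the coded dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# Hypothetical one-record batch for illustration:
raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
print(parse_batch(raw)["ytc_x"]["emotion"])  # outrage
```

Validating before indexing means a single hallucinated label in a batch fails loudly instead of silently skewing the dimension counts.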