Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- One major problem I see with self driving cars is how non-standard a lot of road… (ytc_Ugxm5b3L1…)
- When a country ends up in war, any AI regulations can kiss goodbye... And all le… (ytc_Ugzc6LBTC…)
- it's the same thing when you promt those AI artsy programs to create a "beautifu… (ytc_Ugx_yRst8…)
- The process of creating something new needs something to go off of and that's be… (ytc_Ugw-h75Ev…)
- I’m not using that / Ai is going to take over and it is SCARY bro.… (ytc_Ugy2zAVIA…)
- "guys it's not my fault for using AI blah blah blah economics blah blah oversatu… (ytc_UgwmFsDyo…)
- This shit, is fucking scary. I mean, really. I don't get scared about a great ma… (ytc_UgxkDWkBA…)
- Thank you! Another thing that always comes up in these videos is the claim that … (ytr_Ugyj42kQg…)
Comment
This should scare everyone. Imagine a robots/ai smarter than us… that would actually be sooooo terrifying. We wouldn’t kno if it’ll have ill intentions towards us.. and if they did we couldn’t stop it fr.
youtube · AI Governance · 2024-12-17T06:5… · ♥ 37
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzviWcWDT9w1pAh9iN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxUgpEm4DOIO7renDR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwKLHbo3Al_KIIFejd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzEqulA5JdSd3MlAJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwVaftWSecpWod8MNF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwy4JirtmX8oCacpi94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwomdOYCEwjAYICwqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZjmiNEa2A9yXlaLd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwxfdX661-R0INkL-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyGu9Rz0j07ilxwBzF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
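The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch response could be parsed and validated before storage is below; the allowed values per dimension are inferred only from the sample records on this page (the full codebook may define additional codes), and the function name is illustrative, not part of the actual pipeline.

```python
import json

# Allowed codes per dimension, inferred from the sample records above.
# Assumption: the real codebook may contain values not seen here.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch-coding response and index records by comment ID.

    Raises ValueError on missing IDs or out-of-vocabulary codes, so
    malformed model output is caught before it reaches storage.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, round-tripped through the parser.
raw = ('[{"id":"ytc_Ugwy4JirtmX8oCacpi94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugwy4JirtmX8oCacpi94AaABAg"]["emotion"])  # fear
```

Rejecting out-of-vocabulary codes at parse time (rather than coercing them to "unclear") keeps the coded dataset consistent with the codebook and surfaces model drift early.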