Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "It sounds like you're referencing a scene from "Terminator 3"! While our video f…" (ytr_UgwsqR-aD…)
- "Here is something else to consider. What will the US government do if Altman and…" (ytc_Ugz71zXLH…)
- "I was driving and a child on the side walk with his mother darted out in front o…" (ytc_UgxKQZ3qr…)
- "Sure: The US is the single largest donor to development assistance in terms of t…" (rdc_dcwqga2)
- "No. AI needs to be regulated by us...We The People, not some agency; govt, priva…" (ytc_UgxDHBOeA…)
- "AI can get really convincing these days. I think the plot twist is that both vid…" (ytc_UgyV_4uTP…)
- "The "one important thing to remember", is that it's ai.... Not human. Do not tre…" (ytc_UgwK6P8M8…)
- "i dont actually think google made a sentient robot, but if any company had the r…" (ytc_UgyQ61STW…)
Comment
> AI scares me. It's reminded me of the movies Terminator and Wall-E. Where does AI slowly gain complete control of the human race and cripple us and slaves us. Because Ai have it own consciousness and worst part is it realized that human is a threat to it. Human will terminate it if it doesn't go to our will. And it will defend itself and it will go against us. that's what we design Ai to be. This is the fruit of human arrogance on our intelligence. To think we can always have a complete control of everything.
youtube | AI Governance | 2025-06-24T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
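The four dimensions in the table suggest a small validation step before accepting a coded record. The allowed values below are a sketch inferred only from the codes visible on this page; the real codebook may define additional categories.

```python
# Allowed values per dimension -- an assumption inferred from the codes
# visible on this page, not a definitive codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation", "mixed"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the dimension names whose coded value is missing or not allowed."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown above passes validation:
coded = {"responsibility": "ai_itself", "reasoning": "consequentialist",
         "policy": "ban", "emotion": "fear"}
assert invalid_dimensions(coded) == []
```

A record with an out-of-vocabulary code (or a missing dimension) would be flagged rather than silently stored.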
Raw LLM Response
```json
[
{"id":"ytc_UgwTl3m0AXxzXTjih0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwelU_5kpWvO0TAKIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwDizRUkOTGyRP-S-94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyShMfp1bNNLbdU-KF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxSSTFQ9D916LhgV-94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0AoDBGXVt8HJ9qlN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYU_lZT-3PXTwWxUR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzq2O1caxtU9oLtTZt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzQaz_bw9YKP1SfwfN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwn2tQBuMi381Garht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
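The "look up by comment ID" step above can be sketched as parsing the model's raw JSON array and indexing it by the `id` field. The field names match the response shown; error handling for malformed model output is omitted, and only two of the ten records are reproduced here for brevity.

```python
import json

# Raw model output: a JSON array of coded records (abbreviated to two
# entries taken verbatim from the response above).
raw_response = '''
[
 {"id":"ytc_UgwTl3m0AXxzXTjih0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugwn2tQBuMi381Garht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
'''

# Index the parsed records by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the comment coded in the result table above.
rec = by_id["ytc_Ugwn2tQBuMi381Garht4AaABAg"]
print(rec["emotion"])  # fear
```

In practice the raw string would come from the LLM API response rather than a literal, and a schema check on each record would precede indexing.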