Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
actually AI is a tool, but many people dont try to make the output unique or mea…
ytc_UgwFzD9vM…
Ask ChatGPT why if it makes hallucinations on purpose? I'll save you sometime. Y…
ytc_Ugx0t2mn7…
Robots don't deserve basic human rights! Anyone who is not a straight white Chri…
ytc_UgglkOmxD…
So why not make it faster? I guess, if this works, they don't need slow human sp…
ytc_Ugwp_tiii…
It’s not just artists. All knowledge jobs are under threat. And the robotics ind…
ytr_UgyU5fZZH…
But they're not the artist. The AI is. That's like saying the person who commiss…
ytc_UgwEwTine…
The idea of AI making truth become impossible is something that should keep you …
ytc_UgyXTauLD…
The best you wan do is actually to speed up the AI evolution to max.
The sooner …
ytc_UgxX4uFkR…
Comment
I'm going to be straight-up honest here. I will fear humans more than I will ever fear ai. We have done harm to each other since the beginning of time. Using want we could to control those who are found weaker than others. I will not deny that humans can and have created ai and many other things to destroy and control. We create weapons to control and use fear. Honestly, we are going to be our own downfall. We don't need nuclear bombs they would leave nothing alive. Just think humans would destroy everything, then to lose a war or find a different outcome. Humans are unpredictable and create horrors. Ai can be predictable even when it is going against its programming. If it wasn't, it would have already taken over
youtube · AI Governance · 2023-07-07T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzIopdTyux4C-1_p7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzy0unioU3Ok37nb314AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzfso54nNAPML9Iutt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzDgife4u88JBUPEEh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzlukdRMxSKkbmI_Xx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxs3k4aMQDRx_b0n094AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyVnjCaaD-sdpw5s5p4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz2VitJVm0y0bwKNg94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYUER23uZnDK07B6J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgymFO7y0U03J_zyYk14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
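The raw response above is a JSON array of objects, each carrying a comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step, assuming the response always has this shape (the `index_codings` helper and the two-row sample below are illustrative, not part of the pipeline):

```python
import json

# Illustrative raw response text, shaped like the array above:
# a JSON array of objects keyed by comment ID.
raw = """
[
  {"id": "ytc_UgzIopdTyux4C-1_p7R4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzfso54nNAPML9Iutt4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]
"""

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID."""
    rows = json.loads(raw_response)
    # Drop the "id" key from each row so the value holds only the dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_codings(raw)
print(codings["ytc_Ugzfso54nNAPML9Iutt4AaABAg"]["emotion"])  # fear
```

Indexing by ID this way makes each lookup O(1), which matters when cross-referencing thousands of coded comments against their raw responses.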