Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
You people are all losing the plot of this.
It's not like the drone will be operating with no authorization, if it decides to kill someone in the area that was deployed in, this is not how it works. When human deploys the drone into the area, the human *automatically* gives it authorization to kill all the enemies in the area because the drone operates independently. Same situation if you deploy a squad of people and you don't keep communication for security reasons, so they're operating in the dark, and they have orders to hold some area and they spot enemies, and they engage them and kill them. Does it mean that they operate with no authorization as well? No, you deployed them, you gave them authorization
Platform: youtube
Posted: 2024-08-13T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzxGKMkgqoV0pBS8p54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzia2K7U580HJ4Lycx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRhPADt1MShC9qPAh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxUZqMDwEFLj6P8wAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwgNL5xYsiwehxT8V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzwWwlICTpxuMS3GrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwU3oSiRNJiPafd4g14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwyYFZInXTOR1ztDlB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyzw5bwZk_3BE2LcSt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy2u_taAI4q-n-QQIN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}]
```
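The raw response above is a JSON array of coded records, one per comment, each keyed by its comment ID. A minimal sketch of how a record for a given ID can be pulled out of such a response (the two-record payload below is abbreviated from the output above; the variable names are illustrative, not the tool's actual API):

```python
import json

# Abbreviated raw model output: a JSON array of coded records.
raw = '''[
{"id":"ytc_UgzxGKMkgqoV0pBS8p54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2u_taAI4q-n-QQIN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}
]'''

records = json.loads(raw)

# Index by comment ID so a single comment's coding can be looked up directly.
by_id = {r["id"]: r for r in records}

coded = by_id["ytc_Ugy2u_taAI4q-n-QQIN4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # prints: user resignation
```

This matches the coding-result table above: the last record in the response is the one rendered for the selected comment.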