Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgyAQs8ZD…`: "I totally understand your concerns! It’s natural to feel a bit uneasy about the …"
- `ytr_Ugwx9JW-G…`: "Ai can only go so far with it being new. Give it time. With some post production…"
- `rdc_oh1jqxp`: "Who would bother to stay in a meeting with some jackoff's AI clone? What use is…"
- `ytc_Ugx5-AKep…`: "You don't need to be an expert. The media has been doing this long before AI. Us…"
- `ytc_Ugzx2ujpM…`: "AI is too new to be advanced enough to be even close to consciousness, but that …"
- `ytc_UgyySyF3a…`: "Ai will destroy its self in one pile of plastic and wires this model you talk ab…"
- `ytc_UgwWNs-GZ…`: "first of all a computer is not PERSON END OF DEBATE when he says the computer …"
- `ytc_UgxDODM7A…`: "Finally desk jobs will dissappear, lazy people will have to start working now, t…"
Comment
That guy from the Economist gives me the shivers. The reason we don't want machines to make kill decisions is precisely to not make it easy to kill. That there has to be an empathic and feeling (something AI can't) being behind the trigger. Someone that can refuse an order from a commander that is not in connection with what is going on in the field. An army of mindless AI-drones that just follows orders and kills whatever it is told to kill by their commander is going to be every little rotten dictators dream. And it is going to be the arms industry's big cash cow the comming decades. Autonomous killing machines is something increadibly evil that should be banned internationally before they develop these systems.
Source: youtube · 2025-03-22T05:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwUhTR3w9MjAVHFI714AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzStj9WmuXY7leI9AB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzStj9WmuXY7leI9AB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwvHiIQIPEmHGPXBjh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugz4qgZb0gYjNZdOVmp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwQ2VO8SkrmSQVniUV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgwQ2VO8SkrmSQVniUV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwQ2VO8SkrmSQVniUV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwcEMDPPLLPH69164N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz4JyiYtPfYmGkPzJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
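Note that the raw response above repeats some comment IDs with conflicting codes (e.g. `ytr_UgzStj9WmuXY7leI9AB4AaABAg` appears twice, `ytr_UgwQ2VO8SkrmSQVniUV4AaABAg` three times). A consumer of this output has to pick a resolution policy before it can look up codes by comment ID. The sketch below shows one minimal approach, assuming only the JSON-array format shown above; the sample IDs and the last-occurrence-wins rule are illustrative assumptions, not how this tool necessarily resolves duplicates.

```python
import json

# Hypothetical sample mirroring the raw-response format above
# (shortened IDs; the real IDs are longer base64-like strings).
raw_response = """
[
 {"id": "ytr_AAA", "responsibility": "none", "reasoning": "unclear",
  "policy": "none", "emotion": "indifference"},
 {"id": "ytr_BBB", "responsibility": "ai_itself", "reasoning": "consequentialist",
  "policy": "none", "emotion": "mixed"},
 {"id": "ytr_BBB", "responsibility": "company", "reasoning": "deontological",
  "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model output and index the four code dimensions by comment ID.

    Duplicate IDs are resolved last-occurrence-wins here; first-wins or
    flagging the conflict for manual review are equally valid policies.
    """
    codes = {}
    for entry in json.loads(raw):
        # Keep every dimension except the ID itself.
        codes[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return codes

codes = index_by_comment_id(raw_response)
print(codes["ytr_BBB"]["policy"])  # last duplicate wins -> regulate
```

If the model's output can be malformed (truncated arrays, stray text around the JSON), wrapping `json.loads` in a try/except and logging the offending batch is a sensible hardening step.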