# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a comment by its ID.

## Random samples
- "I will always advocate for real art sometimes u think what’s the point if an ai …" (`ytc_Ugw8HWX0x…`)
- "AI \"film maker\" on youtube said that \"AI film making is hard\", and it has the sa…" (`ytc_UgxZVmNQL…`)
- "Fast forward - now they're saying AI top ticked and the bubble is about to burst…" (`ytc_Ugyjbqaxv…`)
- "Maybe it's time to make people and AI's responsible for the validity of what the…" (`rdc_n8mlwlp`)
- "Digital dementia of the modern age. Read a whole paragraph without an AI synopsi…" (`rdc_nkdedsg`)
- "Ask any AI what do I need and I am certain it won't find out..…" (`ytc_UgwYJ7N7o…`)
- "Where you needed A LAWYER you could be read in by AI to have a Paralegal put th…" (`ytc_Ugz9Sd2zN…`)
- "You explain this like humans cant make decisions for themselves. all doom scenar…" (`ytc_UgzKsSVy3…`)
## Comment

> If you had information that someone was imminently going to kill you in three hours, and you had blackmail on that person. Would you blackmail the person, or die?
>
> AI seems to be just about exactly as evil as humans, but is lacking several built in guardrails humans have.

youtube · AI Harm Incident · 2025-07-27T19:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
[
{"id":"ytc_Ugw2IM5kIF5CWQUTL854AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxg5Fybtu8gkv1f2Yx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzWT5deDEtqHTIS6rV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzKEYGUbfNqweDoAGJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyM6nNuyiOC01CCvLl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzL9TmmgKwe229t8SJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2gc0N53qhHXhVB1l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzzUaqI_KimydIpS8F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz3U0b06fjoCkdFgu14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxhtIuOs0CZCoT2mqR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
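The "look up by comment ID" step above can be reproduced offline from a raw response like the one shown. A minimal sketch, assuming the response is a JSON array of records with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields seen above; the function name and validation logic here are illustrative, not part of the tool:

```python
import json

# Two records copied from the raw LLM response above (field names as in the source).
RAW_RESPONSE = """
[
  {"id": "ytc_Ugw2IM5kIF5CWQUTL854AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxg5Fybtu8gkv1f2Yx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# The coding dimensions every record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw coding response and index its records by comment ID,
    silently skipping any record missing one of the expected keys."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_Ugw2IM5kIF5CWQUTL854AaABAg"]["emotion"])  # resignation
```

Dropping malformed records rather than raising keeps one bad row in a model response from blocking the rest of the batch; a stricter pipeline might log the skipped IDs instead.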