Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "I don't think it's all bad that people are using AI for therapy. I agree though …" (`ytc_UgwvqfUgS…`)
- "This probably doesn’t really directly relate to the problem addressed in the vid…" (`ytc_Ugz8UXGrP…`)
- "It will goes to destroy,and you are making this video's in favour of AI,I am soo…" (`ytc_UgxXb3ZrO…`)
- "\"disney and NBC universal...are filing a joint suit against ai company MIdjourne…" (`ytc_UgwLm8FX6…`)
- "@markthomas7279 The comparison is not the human drivers, it is other self-drivin…" (`ytr_UgyGtg5SX…`)
- "Nah, it’s more like asking a robot to go rob random restaurants and then put eve…" (`ytr_UgwIZTuYi…`)
- "Not a hope in hell. AI is potentially too dangerous a technology not to strongly…" (`ytc_UgzkwtCzY…`)
- "AI cannot spell, cannot write simple dialogue, cannot read prose.You can see tha…" (`ytr_UgwTNhyqI…`)
Comment
ABSOLUTE BULLSHIIT. You are LYING & SPINNING HALF TRUTHS to get hits. DISGUSTING. You are creating more fear. despicable. Ai DOES NOT WORK LIKE THAT. it does not think for itself. Ai uses a store of knowledge it is given by a human and completes tasks given to it by a human. Then by trial and error at billionths of a second "learns." It does not have emotions, volition, or self awareness. it mimics emotion, opinions, preferences, manners, etc... for marketing. it does not actually feel or want. IT IS THE HUMAN BEHIND THE MACHINE YOU SHOULD FEAR.
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2025-08-12T05:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxBhLdII_eIJg7PRVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy-RnYu1fy31faaFvp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwNe5avNL7TtpDkONp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxx1kYtB9IoMZG8wMp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyhb2ve_ytuJ3H01rp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyZPqat_bsk3H27yal4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxDXncfJQoDI9kOVit4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwqn9OyqzFzA4-UZed4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPvH0KylyEurP09YZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyPDhe83kqsI54YYK14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
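Any downstream step has to parse a raw response like the one above before its codes can populate the result table. A minimal sketch of that parse-and-validate step, assuming the dimension vocabularies visible in this one sample (the real codebook may allow additional values, and the function name `validate_codings` is illustrative):

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# response above -- an assumed schema, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "indifference", "approval", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against ALLOWED.

    Raises ValueError on malformed records so bad codes never reach
    the results table silently.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

Rejecting out-of-vocabulary values loudly (rather than coercing them to `none`/`unclear`) makes it easier to spot when the model drifts from the prompt's coding scheme.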