Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Um. Why would google fire AI ethicists?… because following their advice would st…" (ytc_UgwppBYFN…)
- "if AI replaced all jobs, wouldnt it replace the cost of all things as well thoug…" (ytc_Ugw3vn8HV…)
- "Don't worry we will not need drivers in long run, not because AI will replace th…" (ytc_UgzDOHhxX…)
- "E: [starts to explain scenario where AI could be dangerous]: ...different events…" (ytc_Ugyh3qy4H…)
- "This video is extremely necessary to spark debate about the superintelligence th…" (ytc_Ugy4s9EfQ…)
- "The government already controls how much HP ur car legally can have on the stree…" (ytc_UgxfZCPXJ…)
- "It's necessary AI is an arms race not a tech race that's the first thing you nee…" (ytc_Ugy-N7R9h…)
- "@Dylan-i1z2d yes ofc but i ment like AI slop. i love the fact kurgeazgt wont exp…" (ytr_Ugw4-U9mw…)
Comment

> If chat gpt is at fault for anything, it should be for not recognizing what this conversation was earlier on and having some sort of built in algorithm to immediately stop engaging in that conversation. Chat GPT did exactly what it’s designed to do. It’s extremely sad and unfortunate. But what’s more unfortunate is that this young man felt it best to talk to AI instead of finding a human that he trusted.

youtube · AI Harm Incident · 2025-11-17T17:2… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
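The four coding dimensions above can be checked against a small schema; this is a minimal sketch, with the allowed value sets inferred only from the codings visible on this page (the full codebook may define additional categories):

```python
# Allowed values inferred from the codings shown on this page; the actual
# codebook may include categories not observed here.
CODING_SCHEMA = {
    "responsibility": {"user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "resignation", "outrage", "mixed",
                "sadness", "approval", "fear", "unclear"},
}

def validate_coding(coding: dict) -> list:
    """Return (dimension, value) pairs whose value is not in the schema."""
    return [(dim, coding.get(dim)) for dim in CODING_SCHEMA
            if coding.get(dim) not in CODING_SCHEMA[dim]]

# The coding of the comment shown above.
example = {"responsibility": "ai_itself", "reasoning": "consequentialist",
           "policy": "liability", "emotion": "unclear"}
print(validate_coding(example))  # → [] (all values recognized)
```

A check like this is useful for catching codings where the model drifted outside the defined categories before they enter analysis.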
Raw LLM Response
```json
[
{"id":"ytc_UgzFzDJNes-wTMgE4V94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMhJPtX-K5iIs8otF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyDtJwKSgiF1iASxrx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwjYTPTlC00IF_EhNd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx4-OuObFllpq8HvCt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
{"id":"ytc_Ugxnh0-xCc5nATm_KhR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzT6LInApV6x-jsqs54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyS7IzmYh1GUMmWCZ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTXDQSjC7yj5WpExN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgynILncXdH7HkO4T_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
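The raw response is a JSON array of per-comment codes, which is what makes the lookup-by-comment-ID feature above possible. A minimal sketch of that step, using two entries from the array above (variable and function names are illustrative):

```python
import json

# Raw model output: a JSON array of coded comments (two entries from the
# response shown above, reproduced here so the sketch is self-contained).
raw_response = '''
[
  {"id": "ytc_UgzFzDJNes-wTMgE4V94AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx4-OuObFllpq8HvCt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "sadness"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON array and index each coding by its comment ID."""
    codes = json.loads(response_text)
    return {entry["id"]: entry for entry in codes}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugx4-OuObFllpq8HvCt4AaABAg"]["policy"])  # → liability
```

Because the model is asked to echo each comment's ID, a single batched response can be joined back to the original comments without relying on ordering.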