Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Yeah, AI doesn't have the ability to think or reason at all. Of course you can't…" (ytc_UgwX5VJil…)
- "we have to humanize ai , if we want it to be sentient , its just a better idea t…" (ytc_Ugz0ENZc0…)
- "*panicking* / i broke the ai filter by accident.. / *looks up from spicy chat* / can i…" (ytc_UgxjhyTbc…)
- "Humans are not good enough, not morally sound enough, to be working with AI. Hum…" (ytc_Ugyw0C4Gm…)
- "I believe it isn't a hallucination / We can think of training ai as a mini evolut…" (ytr_UgwI41YU0…)
- "4/1/26 11:48am / the future / No locusts / No rouge superintelligence / Company own…" (ytc_UgzcUD7xt…)
- "How do you not see that AI would be destructive and destroy humanity? Even just …" (ytc_UgxQFl3-R…)
- "I was a Yang supporter in 2020 because he spoke the truth about AI before any ot…" (ytc_Ugyuvlh4c…)
Comment
The thing I dislike with self driving cars is not the cars, but the drivers. They will excuse ABSOLUTE all possible countibility for a crash because «technically the car crashed» MF you are opperating the vehicle! Just because its self driving doesn’t mean you’re riding a taxi
youtube
2024-12-16T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzviTS5TE_VfIomU0d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyK5LVPpge4BrX1QDp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzlBodioIWtrpTf51t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxLf45bswMOaJFjpDF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGMggtg909R_ine454AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZaPPH9_ad1lOMA7J4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwqVHsIpyl0EqwTMtl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzTnwrzNb99taXoBLV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyVDnCf_Bzu7accFLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzEOXiWNNJRtKP4Y0p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
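The raw response above is a plain JSON array, so the coded dimensions for any comment can be recovered by parsing it and indexing on the `id` field. Below is a minimal sketch of that lookup; the `index_codes` helper and the two-row sample payload are illustrative (the row structure mirrors the batch shown above), not part of the tool itself.

```python
import json

# Illustrative two-row excerpt in the same shape as the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_UgzviTS5TE_VfIomU0d4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyK5LVPpge4BrX1QDp4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]
'''

def index_codes(response_text: str) -> dict:
    """Parse a raw batch response and index each row's codes by comment ID."""
    rows = json.loads(response_text)
    # Drop the "id" key from each row so the value is just the coded dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_codes(raw_response)
print(codes["ytc_UgzviTS5TE_VfIomU0d4AaABAg"]["policy"])  # liability
```

This is the same join the "Look up by comment ID" view performs: the coded dimensions shown in the table (responsibility, reasoning, policy, emotion) all come from the matching JSON row.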