Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyXiK2fs…: "What happened is really tragic but you can't sue AI for what this kid was going …"
- ytr_UgwBTwCPU…: "@zilvart238 Tesla hasn't had full autonomous FSD, you can't compare the two. It…"
- ytr_Ugxm7hZyR…: "I am pretty sure she became a public figure before AI.... Also pretty disgusting…"
- ytc_UgyQ44Svg…: "Will AI provide Food and shelter good living to worlds population irrespective o…"
- ytc_UgxIGKpyZ…: "I just don’t get how this really intelligent guy, who supposedly is so smart tha…"
- ytc_Ugztaiq0E…: "Hello Pro-AI person here! I could agree witth your takes to somewhat of a middl…"
- ytc_Ugzo8MviP…: "Her skin is Too perfect to be real, need to humanise some more. The skin texture…"
- ytc_UgwPnH5MH…: "i mostly use ai only for copy-paste stuff. Or when i don't get in my head, how a…"
Comment

> the problem isn't is AI working, the problem is we think can't tell if it is or not. there are already people who died because they are unaware the information they got is from an AI because we can't distinct AI data from human created data. AI need to be treated like cigarette. where meta data is attract to every piece of data created by AI. and make it against the law to use AI without meta data.

youtube · AI Moral Status · 2025-08-23T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyDJaIo057RMWQXADZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyoFYvqrkRkcx0MAVx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyxkMtg4QK-0QfNLMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzOp1o9FtdXN0j7jMN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwuwC3P9FSTALHJZ3B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwNODDMh05sH0NdE8p4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzJ9XBWpNx_-HuivPd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyH0UTBHmPTEsp_cyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwFMw7p1LwsrVyYomx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwMNf21re86QQhr5BF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
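A raw response like the one above can be turned back into per-comment coding results with a short parsing step. The sketch below is a minimal, hypothetical example (the allowed-value sets are inferred from the sample output shown here, not from a documented schema): it loads the JSON array and checks each record's four dimensions before indexing by comment ID.

```python
import json

# Assumption: these value sets are inferred from the sample responses above;
# the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "indifference", "approval", "mixed", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Return {comment_id: {dimension: value}}, rejecting unknown values."""
    out = {}
    for rec in json.loads(raw):
        coding = {}
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
            coding[dim] = value
        out[rec["id"]] = coding
    return out

# Example with the first record from the response above:
raw = ('[{"id":"ytc_UgyDJaIo057RMWQXADZ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_codings(raw)["ytc_UgyDJaIo057RMWQXADZ4AaABAg"]["policy"])  # regulate
```

Validating on ingest catches the common failure mode of batch coding, where the model drifts off-schema mid-list, before a bad label reaches the results table.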