Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- “100 years four us humans, How much time do you think it will take for an AI runn…” (ytc_UgyD94waD…)
- “I’m not sure if anyone or anything checks this after 5 months, but you asked for…” (ytc_UgyviY5Wk…)
- “It’s already gone quite wrong. A handful of jerks like him are determining the p…” (ytc_UgxffN5rc…)
- “Only man kind would create its own obsolescence, yet knowing this, we continue a…” (ytc_Ugw1VhARG…)
- “Humans can even agree on the future we want so how in the heck will AI be able t…” (ytc_UgwVJgP0k…)
- “Are we judging an AI for having self preservation? that's a feature not a proble…” (ytc_Ugxsp1ing…)
- “"He argues AI is simply another tool, like a brush or camera. 'I would challeng…” (ytc_UgxaSYLif…)
- “I think there is one version of the future where robotics and AI integrate with …” (ytc_UgzNWQhQY…)
Comment
if AI eventually takes all jobs and everyone ends up on welfare, who will have extra money to buy products? Wouldn’t all the product companies eventually go bankrupt? , the workforce and the consumer base are largely the same group , so what are we even doing ...
youtube · Viral AI Reaction · 2025-11-23T19:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgzuIV4-Q0TtxEoXKEN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyCsPD0oyBNUBd0ant4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZrfadMbdH3l4W6nl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwsCtAO6fBuhoZrBbt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzrvm4yhXOJEXzrQG94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx77tHWcDTp4H-eNwJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXVQr7PUKycwO2YBV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwzMWdjXm6YLVItTpZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugy4dpOkhhcDLJd-ckl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyL9d2TfN2BMa0wVHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
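A raw response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example: the allowed category values are inferred only from the table and JSON shown on this page (the real codebook may define more categories), and the function name is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the examples above;
# the actual codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "government"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        # Every record should carry a YouTube comment ID.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        # Every dimension must take one of the known values.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzuIV4-Q0TtxEoXKEN4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # fear
```

Validating at ingest time means a malformed or hallucinated category fails loudly instead of silently skewing downstream counts.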