Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "We want ai to do the basics so we can do what we love not the other way round…" (ytc_UgwePf1nF…)
- "AI is exponentially more dangerous than climate change ever could be and anyone …" (ytc_UgwpTLFZ9…)
- "I have no earthly idea why this comment is getting hate. Cameras take light rays…" (ytr_UgyttYZ8w…)
- "Friendly reminder that people need to comb through data and label it before it c…" (ytc_UgxT4R5Rh…)
- "@JonnyMOD13 Has to happen. The current system WILL collapse if AI takes all the…" (ytr_UgzSt7FYA…)
- "The same as the Tesla they should have a driver behind the wheel the same as the…" (ytc_UgwXKAGkz…)
- "My dad literally did that with a family picture, I mean it doesnt look bad, but …" (ytc_Ugx5-lit3…)
- "Its sad and im sympathetic to the family but the chatbot wouldnt of been talking…" (ytc_UgzGzr_xD…)
Comment

> AI taking over jobs with more efficiency than humans is the object. Started on assembly lines. Now self driving cars, it will move on to Doctors, and all critical life needs. It will determine what and how humans live. No need for humans to do anything, why waste billions on schools, universities and research centers. We end up as their dumb dummies

youtube · AI Governance · 2025-09-06T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwswHnHaifaBBOte-N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwF2WpH_3HooUeTvU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaeBMBwph_YWof71Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_r7SSSR6EPrKJri54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvTIt7IFCS6LI_mG54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw4LjCV4WInSBu24qR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzOXUfUZHSbbzW5FFx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUj-5rmZZNE41vy4d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwVpvAFzdxRpdpj0fV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugx96vFLHJct_SAlK494AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
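A raw batch response like the one above has to be parsed and checked before its codes are trusted: the model returns one JSON record per comment, and each record should carry only values the codebook allows. The sketch below is a minimal, hypothetical validator, not the tool's actual implementation; the allowed value sets are inferred only from the labels visible in this sample output, and the real codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the full codebook may include categories not seen here.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "company", "government"},
    "reasoning": {"unclear", "consequentialist", "mixed", "deontological"},
    "policy": {"none", "unclear", "regulate", "ban", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID.

    Records missing an ID or containing an out-of-codebook value are
    collected separately so they can be sent back for re-coding.
    """
    records = json.loads(raw)
    coded, rejected = {}, []
    for rec in records:
        cid = rec.get("id")
        ok = bool(cid) and all(
            rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()
        )
        if ok:
            coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
        else:
            rejected.append(rec)
    return {"coded": coded, "rejected": rejected}
```

With records indexed this way, the "look up by comment ID" view is just `result["coded"]["ytc_…"]`, and anything in `rejected` can be queued for a second coding pass.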