Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It’s ridiculous, AI art and hand-drawn art can look the same, but these are diff…" (ytc_UgzMmw3FF…)
- "My friend and I got caught in a blizzard as we were driving through Vail pass in…" (ytc_UgxSTMCB6…)
- "The argument that AI steals from copywritten works could be viewed as valid exce…" (ytc_UgwNcf7jV…)
- "I do like AI, only when it is used right. I still don’t like pure AI videos thou…" (ytc_UgxJ_nKjG…)
- "This smart guy says we will just simply stop making them 😂😂 he sounds like my 4 …" (ytc_UgzdR5tTt…)
- "We seriously needa cut this shit out tryna make these ai robots apart of our soc…" (ytc_Ugwgp5fux…)
- "If I lose my job because of AI, then I won't have the money to buy an AI tool su…" (ytc_UgyQsa3qS…)
- "What happens if cameras fail on Waymo? Oh its cooked. It cant trust any of its s…" (ytr_Ugwy58bG9…)
Comment
If AI works in humans' best interests, the question is whose best interests AI will prioritize, a greedy politician or rich person with power doesn't have the same best interests as a common public person, if the greedy person wants money more than caring about the health of others, then they're best interests aren't the interests of everyone else, and if AI acts towards the best interests of the general public then all the greedy rich people in power and leaders of countries will have to be gotten rid of for the best interests of the general public so that AI could take over and run everything.
youtube · AI Governance · 2026-02-19T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxKJBAO8_Soc1zo25t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzegE7zcJo78G8fY0R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzhjf-WG0vK2wCwXPt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXbqNCUjPMnxBr9P14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNhiZYXbdEQ0Fz_ot4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxBHLWGqG7Uek-Z6tB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyI4tjWV8oDqadKkDJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzNj9g9HpAVel8VnHp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJU4nFRJjOC9Eo-iJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw9cAQMw1LTegDq3Ld4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
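A minimal sketch of how a raw response like the one above could be parsed into per-comment records. This assumes the model returns a valid JSON array where each record carries an `id` plus the four coding dimensions; the function name and the "unclear" fallback for missing dimensions are illustrative choices, not part of the tool itself.

```python
import json

# Two records copied from the raw response above, standing in for a full batch.
raw = '''[
{"id":"ytc_UgxKJBAO8_Soc1zo25t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJU4nFRJjOC9Eo-iJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw_response: str) -> dict[str, dict[str, str]]:
    """Index coding records by comment ID, keeping only the expected dimensions.

    Dimensions the model omitted are filled with "unclear" (an assumed fallback).
    """
    records = json.loads(raw_response)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codings = parse_codings(raw)
print(codings["ytc_UgwJU4nFRJjOC9Eo-iJ4AaABAg"]["policy"])  # regulate
```

Indexing by ID supports the "look up by comment ID" workflow directly: each sample's truncated ID prefix maps to exactly one record.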