Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Notice how they tied it up nice with "mental health clinicians"
...brings a goo…
ytc_UgwjYTPTl…
Nope, nope nope. Even more than before, I don't want a driverless car.
"Sorry,…
ytc_UgwUVJmo4…
this was a bad example showing mid human artists where AI can make way better mo…
ytc_Ugx_Ga3_H…
I remember someone saying that they wanted AI to make it so people had more free…
ytc_UgyMRKTAn…
So what we should do is become chronic liars like Trump. The algorithm won't know…
ytc_UgwFd2jWQ…
Can't we just wait until we colonise Mars and develop AI technologies there so i…
ytc_UgzWph2jZ…
Not into AI art, but I do think some of the "talent" people speak of is the love…
ytc_UgxQfkd2z…
Why is everyone talking about the robot and not the crunch? Or the 3 or 4 teeth …
ytc_UgwQcVxyf…
Comment
AI is likely to make worse even the things it supposedly makes better as the owners of AI will own it and they will rape tF out of everyone else who have been made into peasants. Much of the disparity will come even from when someone was born, ex: Boomer today in retirement likely got money to invest in AI. Baby born in 2025, that didn't have "brains" enough to be born into wealth, they're likely f*cked! This kind of thing is often multi-generational too, ex: guy who's the king, his "noble" lords; it was from sh*t their daddy's daddy's daddy's daddy's daddy's daddy did many decades or even centuries prior.
youtube
AI Governance
2025-12-31T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyLZZSX8Ue41EhgmxF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgybKcBDlvt3dQT5YEB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNuxR28fMuNqnwXeF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzLPT8vSBS6idUsoX14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhlYvB0pQEWqQ4NDd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyp70P4DUaJhx4iauF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6fhM043oX6Pdrxxx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyb563hpdmWAH-cAkZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxcObK_3SnC9BlwpPt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyCnaoZ4NEAj9NDro94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
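A batch response in this shape can be parsed and sanity-checked with a short script. The sketch below is illustrative only, not the tool's actual pipeline; the allowed values for each dimension are inferred from the coded examples above and may be incomplete.

```python
import json

# Allowed values per dimension, inferred from the examples above.
# (Assumption: the real coding scheme may define more categories.)
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "indifference", "mixed", "fear", "approval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose
    values all fall inside the expected vocabulary."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical one-row batch for demonstration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(len(parse_batch(raw)))  # 1
```

Dropping (or logging) out-of-vocabulary rows rather than accepting them keeps a single malformed model output from silently corrupting the coded dataset.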