Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "You just don’t hop n the ring with a robot especially without head gear boi went…" (ytc_Ugwiz8fBP…)
- "What’s scary to me is that I’ve seen ai that barely has any tells aside from the…" (ytc_UgzrR3lep…)
- "I've seen quite a bit of hype around claude I'll give it a go next time I'm on t…" (rdc_n3ktgan)
- "Shad would absolutely claim the artwork he commissioned as his own art if it was…" (ytc_UgxsFrOKp…)
- "UBI doesn't take away from having a main job or is suppose to be your main sourc…" (ytc_UgyXbLMwa…)
- "They are slowly altering reality with AI so that they can slip AI in and make it…" (ytc_UgzLgJApb…)
- "Great report. Another industry evil corporations can’t wait to automate and get …" (ytc_UgzdX6hlO…)
- "I remember my teacher putting ai "art" as a joke for one of our visual art works…" (ytc_UgyxrX5Ju…)
Comment
Steven, you and your AI expert got this one all Wrong!!!
In a few years, AI will get “unplugged” and thrown away because it is not useful to 97% of the people on this planet.
It’s a toy that captivates the attention of a few high IQ people like you but useless to the real world.
And it is nowhere even near actual “intelligence”.
The greatest AI in the world right now isn’t even as “smart” as a human that is four years old.
You should put out content that is useful to the 97% of us real people.
You have an extraordinary platform. Don’t waste it.
With great power comes great responsibility. Think before you speak.
youtube · AI Governance · 2025-10-19T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwHMluy0kJn5blLn594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0AYObMigx_P66EPd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzonYyRbbkQKlcSgEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKEebplAYMBz0uMvt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyoPoJvRDedTM7F0CN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgybWovYb32e7JxmV5t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTnsusfTkdWI48NhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzUCu4ShSb1Jcph7Nl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_douuXaThiwuE12t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy4h4BJR_QGRr0AuiN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
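The raw response above is a JSON array with one object per comment, carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and validating such a response into a per-comment lookup — the allowed value sets below are inferred from the codes visible on this page, not from a documented schema:

```python
import json

# Allowed values per dimension, inferred from the codes observed above
# (an assumption, not a documented schema).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "approval", "outrage",
                "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown values."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(parse_codes(raw)["ytc_example"]["responsibility"])  # -> developer
```

Keying the result by comment ID mirrors the lookup this page performs: the dashboard fetches a coded comment by its `ytc_…`/`rdc_…` identifier and renders the matching dimension/value table.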