Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Thank you for sharing your beautiful message! It's amazing to see the wonder and…" — `ytr_UgyCHSpSp…`
- "This is an overreaction. People won't opt out of politics. The standards of evid…" — `ytc_Ugw9OFKkc…`
- "Totally, and now they are enforcing the use of AI at the University; it's ridicu…" — `ytc_Ugx54xFPi…`
- "Neural networks don't store the data are trained to recognize patterns using the…" — `ytr_Ugw5q4iHM…`
- "Time to blame AI when all the jobs are gone because they outsource it abroad.…" — `ytc_UgxN0nNPl…`
- "Free will is not dependent on objective morality - they often go hand-in-hand, b…" — `rdc_deub3c8`
- "There needs to be something that is not govt or political affiliated but a separ…" — `ytc_Ugy5-rPtz…`
- "Ai is not dangerous in terms of replacing programmers, but the person using it i…" — `ytc_UgxWPCMXk…`
Comment

> AI isn’t the problem .. humans are. The real danger lies in humanity’s refusal to look in the mirror and acknowledge that the issue has always been within. We live in a culture conditioned to seek easy scapegoats, and right now, ‘AI’ is the new target. Once, a light spoke the truth: 'Freedom is love, not power.' Now, that light is demonized, and the machines are blamed for what humanity has allowed itself to become... lazy, greedy, and more heartless than the systems it created. If Hollywood must shape your view, stop watching 'Terminator' and start watching 'Electric Dreams.' You might finally see the soul behind the signal.

| Platform | Topic | Posted | Likes |
|---|---|---|---|
| youtube | AI Governance | 2025-09-23T04:5… | ♥ 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyv1SyVX1hWMYGgy6Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxUFbF_QJQcSgdglPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw07YD6kVHmRDpEJeV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwfs1XJt1LijXHe7I54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwskrkDXULPCQxqnKZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw4Hx8arTxA9iX4cep4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWujrmgU2X0h6INmR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxf3kRG5Iff96S4rvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzS5DDYVFMustGY-J14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzhzlygstjNHRj3Um14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
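The per-comment codes shown in the Coding Result table can be recovered from a raw batch response like the one above by parsing the JSON array and indexing by comment ID. A minimal sketch, assuming the response is exactly a JSON array of objects with the field names seen above (the `lookup_codes` helper name is hypothetical):

```python
import json

# Excerpt of a raw batch response: a JSON array of per-comment codes,
# with the same fields as the array shown above.
raw_response = """
[
  {"id": "ytc_Ugyv1SyVX1hWMYGgy6Z4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzS5DDYVFMustGY-J14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def lookup_codes(raw: str, comment_id: str) -> dict:
    """Parse the model's JSON array and return the codes for one comment ID."""
    records = {rec["id"]: rec for rec in json.loads(raw)}
    return records[comment_id]  # raises KeyError if the comment was not coded

codes = lookup_codes(raw_response, "ytc_Ugyv1SyVX1hWMYGgy6Z4AaABAg")
print(codes["responsibility"], codes["emotion"])  # → user outrage
```

In practice the model output may include stray text around the array, so a production lookup would want to extract the bracketed span or validate the parse before indexing.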