Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I agree you with it's just sad we live in society were you have to compete meani…" (ytr_Ugw-6oERr…)
- "AI is a tool, not a medium, if you're going to use it use it as a reference only…" (ytc_UgzL3SsKB…)
- "The thing with nuclear power is that its usually locked off to scientists and go…" (ytc_UgxnECp0I…)
- "AI reaction times are better than a human, so it should have at least attempted …" (ytr_UgyGFiRB7…)
- "The end goal of AI and automation is that we will have 1 % of the population own…" (ytc_Ugx8XBKpJ…)
- "Given how terrible movies have been the last five years, why not give AI a chanc…" (ytc_UgwiTHI23…)
- "The irony is brutal: the system that made a small group insanely rich is now pre…" (ytc_UgwBn0PGk…)
- "AICarma's weekly visibility scores really help me track how often I'm mentioned …" (ytc_UgyPCU7Jy…)
Comment
If AI does end up helping us more than killing us lol then what will the corrupt medical and big pharma do when it makes radical advancements that potentially cure diseases that they make billions off of. I don’t see them sitting back quietly, maybe they won’t have a choice🤷🏼♂️
youtube · AI Governance · 2025-09-24T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzw91VV3WxS4NJK4xh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqOyMtM-RITcOZhLR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxaYfSSknWwESCIedR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzkMyiv1qIvHWdU_lx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyj6fEmw5X77Qa2Eat4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgybfdkcrGvgox5i7qV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"unclear"},
{"id":"ytc_UgwObIW7eH1IfTIz1X54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgziGps0f9rZimNnIoJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy5KF7Lbg-Woy_PqTN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzWZZVuinbHsNXyNT14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]