Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgwJm-OAr…: So far I have never found any LLM to be particularly useful, which is why I only…
- ytr_Ugx0GheHK…: @quinntinmannyou cant have a serious discussion about ubi because it's a fanta…
- rdc_oi41x5k: I'm doing the opposite? AI replacement is literally the value proposition, and I…
- ytr_UgxZR951a…: As a cyclist, I absolutely agree. I'd feel much safer around self-driving cars k…
- ytc_UgxwgQUbI…: "(...) resulting in a slew of aspiring AI artists taking up the task of creating…
- rdc_fvvzstn: Yeah my thought too. There are a few key words and jargon that give it away each…
- ytc_Ugz081KDM…: Please protect your braincells and avoid ai at all costs people / Please don’t us…
- ytc_Ugz6XqfzL…: They will hire all the people back. AI is snake oil. And I have a degree in it.…
Comment
AI will be the only technology with the intelligence to save humanity. The US government has absolutely no idea what it's doing, and makes everything worse no matter what it is. I sincerely hope China wins this race, because they're the only ones with the intelligence to create technology which benefits the entire planet, instead of just harvesting capital for the wealthy by inflicting poverty on the masses with maximum cruelty.
youtube · AI Governance · 2025-07-02T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzWw-x4a7UStR-rGbN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6UTwAvX3_gtlYgOJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgznPmBLkvo5FOqEu8F4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzoqZYz6pEHjdnQYWt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7tkeuNbHfoNSBtul4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyWZM2sAQwrvUo6pJJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwYHiaSvb0alX4XwyB4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxn7hKl7TfPBUsBWNt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrPQoTYLm_qOhLTkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwBDvaFLFCharN6-Dx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"}
]
```
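A raw response in this shape can be parsed and sanity-checked before being loaded into the tool. The sketch below is a minimal example: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the Coding Result table above, but the parsing helper itself is hypothetical, not part of the tool.

```python
import json

# The five dimensions shown in the Coding Result table above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

# Two records copied from the raw response above, for illustration.
raw = '''[
{"id":"ytc_UgzWw-x4a7UStR-rGbN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwrPQoTYLm_qOhLTkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]'''

def parse_codings(text):
    """Parse a raw LLM coding response and verify every record
    carries all five dimensions; return a dict keyed by comment ID."""
    records = json.loads(text)
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")
    return {rec["id"]: rec for rec in records}

codings = parse_codings(raw)
print(len(codings))  # 2
```

Keying the result by comment ID mirrors the "Look up by comment ID" view above: once parsed, any coded comment can be fetched directly by its `ytc_…`/`ytr_…`/`rdc_…` identifier.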