Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “Omg this is so true. Can’t show gratitude and compassion to other humans but cha…” (ytc_UgyyVmxRA…)
- “Incorporate an AI tax on corporations but give reasonable tax breaks for hiring …” (ytc_UgzXTSTiF…)
- “I just want the work and leave me alone. I don't like people or the meeting. Sen…” (ytc_UgxctHtS5…)
- “AI chat apps are TOOLS, not cure-alls! They’re not supposed to “fix” you, they…” (ytc_UgwEZCpqn…)
- “As a developer I have a hard time calling it AI tbh... Generative AI currently i…” (ytc_Ugzhxprir…)
- “uhm...idk...to me the argument against A.I. is more of a moral one, than anythin…” (ytc_UgxWeuTXg…)
- “3:16 overhyped yes under engineered highly doubtful. Tesla’s are year over year …” (ytc_UgxxznVrO…)
- “If we just go ahead and use the ai, we won't have strikes. Just use the ai and f…” (ytc_Ugw21eXTz…)
Comment (youtube, 2025-12-08T23:4…)

> While alignment sounds like a great it’s also not enough because ai is constantly growing we don’t stand a chance at really controlling it once it gets too big
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2gZ0z0t7R6EsLnC14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfyxpWE3zboiIJha54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMaFKdJL6BESTOb9t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZrrtemqNEIWuNEsx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgznWyrSYBoUXvMAWdN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5CKr1fHAzXRAiuml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzFrX_wTcU-Mu4bV2F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyiN0_xwcEzGUu9Bhd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy44u4Y1qDq-hfji9N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxfMW7NEibvWU5UtsN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
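A batch response in this format — a JSON array of per-comment records with `responsibility`, `reasoning`, `policy`, and `emotion` fields — can be parsed and indexed by comment ID in a few lines of Python. This is a minimal illustrative sketch, not the app's actual lookup code: the names `index_by_id` and `DIMENSIONS` are hypothetical, and the two records are copied from the response above.

```python
import json

# Two records copied from the raw LLM response shown above.
raw = """
[
  {"id": "ytc_Ugz2gZ0z0t7R6EsLnC14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgznWyrSYBoUXvMAWdN4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index each record by comment ID,
    rejecting records that are missing any expected dimension."""
    out = {}
    for rec in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out


coded = index_by_id(raw)
print(coded["ytc_UgznWyrSYBoUXvMAWdN4AaABAg"]["policy"])  # prints "regulate"
```

Indexing by ID up front makes the “look up by comment ID” view a single dictionary access, and the dimension check surfaces malformed model output at parse time rather than at display time.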