# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a response by comment ID, or pick one of the random samples below.
- `ytc_UgyU0ce52…`: "The goal of A.I. is to replace humans, PERIOD. (Spoiler Alert) I've seen the mov…"
- `ytc_UgzB1XpOc…`: "Im a artist and i have a love hate relationship with ai, I think ai is ok for pe…"
- `ytc_UgzIH-Z5I…`: "People have been paying attention for years but it is difficult to know what is …"
- `ytc_Ugw-ae8wr…`: "Does anyone remember the Touring Test? When people thought we would be doomed? W…"
- `ytr_Ugy5OnUqt…`: "Disagree. Technology is not AI alone. AI is simply one single piece of fledgling…"
- `ytc_UgwQWO4EQ…`: "“working will be optional… because you’ll have robots plus AI… everyone will hav…"
- `ytc_UgzAh8tla…`: "The cave men didn't need AI , digital IDs, less is more! Let's keep things simpl…"
- `ytc_UgychPLm0…`: "AI is deflationary. Getting wealth isn't going to matter. As far as abundance go…"
## Comment

> I think AI being a treat to human life is an exaggeration, think about it if AI can decide to end human life on earth then what will the AI do after wards. As super smart as it can be AI need dum human beings to have a purpose/ function. If AI is as smart as they say the world will be a better place than the doom-day people think it will be

youtube · AI Governance · 2025-06-17T19:2… · ♥ 2
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
{"id":"ytc_UgxCbVOiJOkTpKUBrNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_Ivqe7Qry8VzrtIR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzIQ3nUN_7vnEapQKN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyan4qiaIRJe2fxT1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBsG4xCmQ7DMTv87h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy2z2Ag80nhIpYaHop4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwRK7Xl5mQabt1IeqF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugze_B0pmBOFhtW6G2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxhFCiry0KGZ7r_bY54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxnPTrN6g4utkjGeDt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
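The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of parsing and sanity-checking such a batch might look like the following. Note that the allowed label sets below are only those *observed* in this sample; the project's actual codebook may define additional values, and `validate_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Label sets observed in the sample response above; the real codebook
# (not shown here) may define additional values for each dimension.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        # Comment IDs in the sample start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"record {i}: unexpected comment id {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"record {i}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# One record from the response above, passed through unchanged.
raw = ('[{"id":"ytc_UgzIQ3nUN_7vnEapQKN4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codes = validate_batch(raw)
print(codes[0]["emotion"])  # approval
```

Failing loudly on an out-of-codebook label is deliberate: malformed LLM output is easier to catch at ingest time than after it has been aggregated into dimension counts.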