Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The only reason why ai couldnt take over coders jobs is because regular people d… (ytc_UgzfT-m8W…)
- What Mr. Kaku means is 'sentient AI' also called 'General AI', being AI with … (ytc_Ugw3mCPGL…)
- heres my two cents: if the agi is smart, wouldnt it see regulations as "control… (ytc_UgzaLP3v7…)
- So THAT is what you put in a MAN made machine called robot? Wouldn't expect anyt… (ytc_UgzWSvPJr…)
- the reason the last one was bugging out was because you typed febuary instead of… (ytc_UgyE4Ynug…)
- It's not normal bro no control they do 😅😅😂😂😂😂😂😂😂 yo pass bye Ai don't 😴 😒 🤣 ha… (ytc_UgxMGy-gE…)
- I find it funny that ai art exists, eventually, when the entire thing is full of… (ytc_UgwGDFaTe…)
- "Self-driving" trains, trams, trucks, and buses would significantly reduce the n… (ytc_Ugy5lKkIe…)
Comment
So, what will happen to the people whose professions or jobs have become obsolete because of A.I.? If I understand this correctly, then these people would become redundant and no longer profitable for society? What does that mean for the societies affected, and what could be the solution to the problem.... S O Y L E N T G R E E N ? ! ? !
youtube · AI Governance · 2025-12-29T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx9vPXsehHDP_5TIix4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwWJnIgcZt6Zs1k_bZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBDJV0D8jBrjgvDx54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy93BZdOq4G0L9aTmJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwoFJL0kfVWs1FwCih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwpL5Ur9t7VAaJNEOZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgylYy827yoMBUOTIWp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-ii4EazI6p8BcBrZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzwacKN2hlSTHHQ1vx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwm-EhKX0K-bDmh62F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
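The raw model output above is a JSON array of per-comment codes, so it can be parsed and checked before being stored. A minimal validation sketch follows; note the allowed value sets are inferred only from the samples on this page, not from an authoritative codebook, and `validate` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The real coding scheme may include values not seen here.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "unclear", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse raw model output; reject records with missing IDs or unknown codes."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r}: {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
print(len(validate(raw)))  # 1
```

Validating eagerly like this surfaces malformed or off-schema model output at coding time, before bad labels reach the lookup view.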