Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- I'm a truck driver and I don't think AI will replace us in this job for a long t… (ytc_UgxnZAJte…)
- So it'll end up being consciousness bearing bodies and unconscious super-intelli… (ytc_UgxhJtX6j…)
- Mfw the "Bad beginner art" is still 20x better than mine (I still don't use AI b… (ytc_UgzRn3RYi…)
- I agree with you with all except the last part of the video, when someone made a… (ytc_UgwZooklZ…)
- The people crying to pause are just assclowns just making noises like chickens c… (ytc_Ugx8hkgo8…)
- Contemporary artists are NOT the most known artists lmao. And my art ISNT going … (ytr_UgwXCUi9c…)
- Also parrots are really smart. They're one of the few animals observed to [be ab… (rdc_j8e19vp)
- Ahahahaha! Pulling the plug on the server solves that one :-) Errant behaviour? … (ytc_UgxKzLbuH…)
Comment

> Machines weren't meant to coexist wih humans but to someday merge with them. The needs of government will change dramatically if the brains it assimilates with were lucid, knowledgeable, and emotionally sound. Societies invented checks and balances knowing how unfit they were to govern themselves. By inventing a modern day industrial age machine, humanity is brought farther from the brink of destruction, and the AIs we will invent can be trusted to fulfill its duties in making the world safe and prosperous. In conjunction with currently existing data-mining algorithms, AIs will be able to communicate and understand the needs of individuals. But without this system, we head down the path of ignorance and intolerance.

youtube · AI Moral Status · 2017-02-24T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjyarnsMmnkGngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghTXyshqik943gCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghQsKYsd-Ki_3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgggwBPrVX7wAHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggE6SPzi0kvdngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggXexT-TTeXzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugjl5NS5pTmJcXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugjc_-iQJM-_LHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugg75IgfCGwkrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggJXPMrGWhAjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
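A response in this shape can be parsed, indexed by comment ID, and sanity-checked in a few lines of Python. This is a minimal sketch: the field names come from the sample above, but the validation rule (every record must carry all four coding dimensions) is an assumption, since the full codebook is not shown here.

```python
import json

# A small excerpt of the raw LLM response, in the format shown above
# (two records copied from the sample batch).
raw_response = '''[
  {"id": "ytc_UggE6SPzi0kvdngCoAEC", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghTXyshqik943gCoAEC", "responsibility": "user",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

# Assumed schema: each record needs an ID plus the four coding dimensions.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

# Basic validation: reject records that are missing a coding dimension.
for rec in records:
    missing = REQUIRED_FIELDS - rec.keys()
    assert not missing, f"{rec['id']} is missing {missing}"

# Look up the featured comment's coding result.
coded = by_id["ytc_UggE6SPzi0kvdngCoAEC"]
print(coded["responsibility"], coded["policy"])  # government regulate
```

The ID index is what makes "inspect the exact model output for any coded comment" cheap: each lookup is a dictionary access rather than a scan over the batch.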