Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_UgwcYe7RG…: "Or maybe it is the end of human work, and we can go back to the eden where all t…"
- ytc_UghhKmo9_…: "Once artificial intelligence reaches human level it'll be the new dominant speci…"
- ytc_UgwXqv1G7…: "chapters 2-3 of the book were very informative, but the rest of it reveals more …"
- ytc_UgwJukpti…: "first it is not illegal, read TOS for websites you are posting on, second, then …"
- ytr_UgxqmFAg3…: "I understand where you're coming from! The interaction with AI can definitely fe…"
- ytc_UgztpKBVq…: "Gemini said it would pull it and saved the human which means, it meant to save h…"
- ytc_UgyznHaka…: "Robot for testing bullets - is not the best crash test actually. We can’t even s…"
- ytr_UggI9deWo…: "Killerwalrus234 I would have to disagree on that last bit. While I currently do …"
Comment
I can’t help but think of that alien prequel movie with Michael Fassbender playing a robot. I don’t remember much about it but I remember what the movie made me philosophise about. Like what if ‘god’ or ‘gods’ created us in their image, but we were superior and essentially took over. And then if we create robots in our image but they are superior in many ways to us and whether intentionally or not, they end up taking over from us. What if this is a cycle of life that is unavoidable? Pretty outlandish I know, but just an interesting theory I had.
youtube · AI Moral Status · 2022-11-06T01:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyMLClJZr9zziKHsOB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfeNQ7ZiqlrXiNjOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXDZhmUcG3ORGUc_14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgymI4NBwGgCBJWs5F54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7mv8CDpMRt7CmxtN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyncpjAvuqq57oWImt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzc0qa4fBHo0x-a_bV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiEFunAJlCgRezqd14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxfi61DfDQRWlGFpfd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHiXcIxl18ntyTF054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
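A raw response like the one above can be parsed and sanity-checked before the codes are stored. The following is a minimal sketch, not the pipeline's actual code: the allowed-value sets are inferred from the sampled responses on this page and may be incomplete, and the `validate` helper name is hypothetical.

```python
import json

# Allowed codes per dimension, inferred from the sampled LLM responses above.
# The real codebook may contain additional values.
CODES = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with a plausible
    comment/reply ID and known codes in every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # IDs in these samples start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODES.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgyMLClJZr9zziKHsOB4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate(raw)))  # prints 1
```

Rows that fail validation would be flagged for re-coding rather than silently dropped in a production run; the sketch only shows the filtering step.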