Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.
- `ytc_UgwCMsVQR…`: "Who knows enough about it to regulate it ? Only the developers know where it is …"
- `ytc_Ugy2Z6tQa…`: "i talk to meta ai he defended muhammad & allah, he dont know they are false prop…"
- `ytc_UgwMgn0aS…`: "Different applications= how we can kidnap you with little resistance!!! Robot: …"
- `ytc_UgwIf9Dig…`: "Took too long to get to the actually topic you click baited me into. Here's my h…"
- `rdc_oi0dnnd`: "The EU has been moving towards electronics payments too through Wero. Trump got …"
- `rdc_k9i89f2`: "You will have humans overseeing it to make sure the AI is improving. If it gets …"
- `ytr_UgxzUYVL5…`: "@blackjackjester transformative wouldn't matter. The way copyright laws work is …"
- `ytc_UgwH_gsi6…`: "Ai is disgusting honestly I'm trying to be an artist so when I see art I feel so…"
Comment (source: youtube, video: "AI Moral Status", posted 2020-02-19T22:4…)

> You want to build station in Mars well in the future D's robots will do that for us and we'll build stations on Mars as well as other things that will benefit a colony that's my idea instead of sending you and swear they can't go outside but a robot could in certain environments maybe even some robots that will build a city with the resources of the environment and the robots with know how to implement the actions needed
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
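Each coded record is a flat mapping from dimension to one categorical value. A minimal validation sketch, assuming Python and only the value vocabularies visible on this page (the full codebook may define additional values):

```python
# Allowed values per dimension, inferred from the examples on this page;
# this is an assumption -- the actual codebook may permit more values.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

ok = {"responsibility": "none", "reasoning": "unclear",
      "policy": "none", "emotion": "approval"}
print(validate(ok))  # -> []
```

A check like this is useful before loading model output into the coding table, since an LLM can emit labels outside the schema.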
Raw LLM Response
```json
[
  {"id":"ytc_Ugyyw77kbe3DWsoJfet4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyHHuX04hUAJ5Ju8554AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzxeXr8ZBKWRJvDJD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw6zJUF6uA2vOh5qF54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQYHAqDi5lcfSD8jt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy8hc5Y6J6ss119hy54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzOhPOqjqbyShKdAjR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKGAqYvZ3lbHJDIaV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyQEEeH8V8PDf2BO594AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyB0hTWRzYuiJJKPCJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```
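The raw response is a JSON array of per-comment codes, one object per comment in the batch. A minimal sketch (assuming Python and the schema shown above; the two-record payload below is an abbreviated stand-in for a real response) for parsing such a response and looking up a single comment ID:

```python
import json

# Abbreviated stand-in for a raw model response in the format shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzxeXr8ZBKWRJvDJD14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyKGAqYvZ3lbHJDIaV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and index the records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_codes(RAW_RESPONSE)
rec = codes["ytc_UgzxeXr8ZBKWRJvDJD14AaABAg"]
print(rec["emotion"])  # -> approval
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: after one parse, each lookup is a dictionary access rather than a scan of the array.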