Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So there is a video game called Star Citizen, do not play it, it is overpriced, but inside the intellectual property of this game, humanity has dealt with large model AI systems on multiple occasions, and in each occasion it has ended with a massive loss in human life. Moving forward into the modern day inside of the video game, all but the most basic AI is completely outlawed, mainly because we learned our lesson about what happens when we put our faith in these types of technologies. So we understand it in our science fiction. Apparently as humans, we just have to figure out how to learn it in the real world.
youtube · AI Moral Status · 2025-12-19T12:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxZNmRS22waNYTiEVZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwhnk2eaEA9eoV8shB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwIYvNoKLolOlnXnu54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzNP-P8UI_ZmONvNTZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxOECvF0OT5nnhYmW94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyHdZyP_TPpEZkozVJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzT900WY9_FT5AdxOF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZbTsHf4p8CFheT_Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzuUL361TWdqkka8614AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzqMx9Ke0Qk8svOZKR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```