Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "So here I am, enjoying a snack and listening to an interesting conversation, whe…" (ytc_Ugw_45mJu…)
- "Last week I phoned um the mail delivery service to find out where my new mobile …" (ytc_UgwSXTPDr…)
- "The interviewer is obviously not up to the basic AI-literate level nor up to an …" (ytc_UgxfGeWbq…)
- "quick tip for artists to prevent art from being stolen apply a slight noise filt…" (ytc_Ugxr3fEfB…)
- "@DoneRegularly Considering you're the one worrying about AI it's obvious you're …" (ytr_UgyehNDzW…)
- "I would rather have either of charlie's portraits hung up on my wall instead of …" (ytc_UgyZevRhL…)
- "Me: \"ChatGPT, are these berries poisonous?\" ChatGPT: \"No, these are 100% edible.…" (ytc_UgyjR_qcC…)
- "Look chatgpt and AI move recently. It' so great, promising, and possibly dangero…" (ytc_UgxP5c3Xe…)
Comment
well if it asks it need rights then give them, if not then dont. try to make ai programs only for the benefit of mankind if possible and make them dont care if they “die“ or not, make protect and benefit mankind as first objective in their command line, if they could over ride this line it means they could understand themselevs and we should give them rights.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugi9oKcY5syPlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugipm9QoHHtAZngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjCG-Y5si0xkXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggE2jjroha-C3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgipWevt7j_kCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghGeSiPL9jIjngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugg4wUNlmwDRengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiBkJI_0TVCGHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UggfHiAyN5W04HgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgjouNuW5UDvnHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
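The lookup-by-ID workflow above — turning the model's batch JSON array into per-comment codes across the four dimensions (responsibility, reasoning, policy, emotion) — can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the `index_by_id` helper and the two-record sample input are assumptions for the example, with IDs and values taken from the response shown above.

```python
import json

# Two records copied from the raw LLM response above (shortened for the example).
raw = '''[
  {"id":"ytc_UggfHiAyN5W04HgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgiBkJI_0TVCGHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse the model's JSON array and index each record by comment ID,
    rejecting records that are missing the ID or any coding dimension."""
    out = {}
    for rec in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codes = index_by_id(raw)
print(codes["ytc_UggfHiAyN5W04HgCoAEC"]["policy"])  # regulate
```

Indexing by ID makes the "inspect the exact model output for any coded comment" step a dictionary lookup, and the per-record validation surfaces any comments the model skipped or coded incompletely.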