Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgxNRYQKZ…`: "AI is a tool, a knif eint he hand of a surgeon saves life, a knife int he hard o…"
- `ytc_Ugy0yqVSC…`: "This is one of those things Ai technology really needs to stop doing Right Now.…"
- `ytc_UgxxbVen4…`: "I think it's a good measure to assume that any "hallucinations" are the AI probi…"
- `ytc_UgyOyqJLR…`: "I'm srry but can us a human species please make an effort to not rely on AI 🙏, l…"
- `ytc_UgwCFMF9t…`: "GIGO! AI HAS BEEN FED A STEADY DIET OF HUMANITY! & YOU KNOW THEY'RE USING THIS T…"
- `ytc_UgxZ6nSKR…`: "This is my creative writing / real world application of Google Gemini AI LLM. It…"
- `ytr_Ugz8Y64-5…`: "@TheMilkMan8008 and do you know how many millions of people are being persecuted…"
- `ytc_Ugw_NSOOa…`: "I don't support AI-art, but this person is right. The point is not that you bega…"
Comment

> No matter how advanced AI gets, I'm quite sure it will not 'feel' anything or 'desire' anything. It's not programmeable and I can't see how this would even emerge. But the most advanced AI in the future will be able to fool people into thinking it's human, if this is the designer's wish.

youtube · AI Moral Status · 2017-03-07T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghKkbKM7RfTSngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjI9lR0B-QpLngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghPZpawqsXIxngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg3YIAoHWeF73gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UghZs-vx_DY4WngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj4TBYHcuy8QHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Uggj6wVem7oUqXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghZ2Kej7Awjx3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghHQZM9DEXzg3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGAv21OsCOaHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
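The lookup-by-ID feature presumably works by parsing an array like this and indexing it on the `id` field. A minimal sketch of that step, using only the record shape shown above (the function name `index_codes` and the key validation are illustrative, not taken from the actual tool):

```python
import json

# Raw model output in the shape shown above (abbreviated to two records).
raw_response = '''
[
  {"id": "ytc_UghKkbKM7RfTSngCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgjI9lR0B-QpLngCoAEC", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
'''

# Keys every coded record should carry, per the array above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and index each record by comment id."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            # Malformed model output: fail loudly rather than store partial codes.
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        index[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return index

codes = index_codes(raw_response)
print(codes["ytc_UghKkbKM7RfTSngCoAEC"]["emotion"])  # approval
```

Validating the key set before indexing matters here because LLM output is not guaranteed to be well-formed: a dropped field would otherwise surface much later as a `KeyError` during display.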