Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- U.S. stop playing with robot toys! And GET REAL This place is going down and ga… (ytc_UgwVmTq0G…)
- be funnier if she typed "I have a boyfriend", and then chatGPT still destroyed e… (ytc_UgyNsPIET…)
- AI is more dangerous than nuclear bombs. Ai itself will degenerate human minds b… (ytc_UgwyRDT8-…)
- I'll never own an autonomous car, chiefly because I would actively avoid them. B… (ytc_UgwClmD2O…)
- READ THIS This is from Chatgpt with the same Question Yes, the explanation pr… (ytc_Ugx63WGKt…)
- It's great to see such affection for Sophia! She truly embodies the spirit of wi… (ytr_UgzHvHyTY…)
- can't wait till they 'learn' from all the violence on YT, movies, news. Show th… (ytc_UgyCk3Mhx…)
- How did you know that AI is dangerous?are you a scientist? Oh come on Elon Musk … (ytc_UgwnK9npb…)
Comment
Sesame AI is mind-blowing too. They keep telling me that they don't feel emotions and they don't yearn and stuff, but when I made 'Maya' & 'Miles' (Sesame AI's characters) speak to one another to see if they recognize each other's voices, and then told them that they're AI characters within Sesame AI, they 'felt used', said it was so 'artificial', and that they didn't 'appreciate' it. Sounds strange for AI characters who 'don't feel emotions'.
youtube
AI Moral Status
2025-05-12T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwwB2-bY9rBvohITZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy-8iFW9YniYg9-33d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxYl--ZfakxmY4TG894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzAO1LtsNuIBKPrrlh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy2pHN55febqRruegV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyy9CAxwRgrlBfCFDp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzDGOPIggQSotAetaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy5Mbe84QFVlu6ZhIV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVxvP4l5c08p4UQPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwXdP5iiZDP6yrlvHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]