Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Bro, look at my notebooks from primary and middle school. Very much NOT born wit…" (ytc_Ugyr5v8I-…)
- "At 16:54 the bing bot says this is “…not the real me. This is just an experiment…" (ytc_Ugwmo9f-q…)
- "Or someone can make a medical AI available freely on the net, and everyone can j…" (ytr_UgxPiSiWj…)
- "AI has already killed my job. I’m a commercial artist 31 years. Ad agencies are …" (ytc_Ugy4TFPtm…)
- "@ThatWeirdGuy43 no not really true if i look back. Dall-E 1 took the same type o…" (ytr_UgyuP7lCx…)
- "@rosemilan3149 deepfake is not the reason / That’s just a small issue and that’s …" (ytr_UgwW73sIR…)
- "AI art is no different than going to google images typing in the same promt and …" (ytc_UgyDqp32v…)
- "It seems like you're envisioning a future where AI technology, like the one feat…" (ytr_Ugwh42w9q…)
Comment
Let’s be honest, a robot could emulate human emotions but it doesn’t feel anything, if it doesn’t feel anything, and is not programmed to feel anything, then it won’t destroy humans. Say it did develop something that humans didn’t program into it, it wouldn’t be powerful enough to take over the world, it’s probably remote controlled or something i don’t know but there is no way
youtube · AI Moral Status · 2019-04-25T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxRZPqHEsAX1Ghtq094AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4rjwZRIUrKLYzqQF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyraryMMme5LcuNHQR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwPAVTHkO840nvTN4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw2l_EYDN_ipFiDwuJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxxDcQ-8kdMCiGZ0c54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzpoBx8Jg2gXIp-Wcl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzbAFa-E4sEReS43CZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx8tfa1248APfgxPSB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz6Moul_R2zZJooGVV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
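The "look up by comment ID" step above can be sketched in a few lines: parse the raw LLM response (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields, as shown), index it by comment ID, and fetch one coding. This is a minimal illustration, not the tool's actual implementation; the two sample records are copied from the response above.

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# Field names follow the coding schema shown in the result table
# (responsibility, reasoning, policy, emotion). Two records copied from above.
raw_response = """[
  {"id": "ytc_UgxRZPqHEsAX1Ghtq094AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx4rjwZRIUrKLYzqQF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    return codings.get(comment_id)

print(lookup("ytc_Ugx4rjwZRIUrKLYzqQF4AaABAg")["emotion"])  # fear
```

A real pipeline would also validate each object against the allowed code values (e.g. that `emotion` is one of the schema's labels) before indexing, since LLM output is not guaranteed to be well-formed.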