Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We need to listen to those big guy talk cautiously. AGI could happen someday, bu…" (ytc_UgyoQCFnN…)
- "First mistake the passenger made was getting into a driverless car! OMG! Hire a …" (ytc_UgyfAvm7t…)
- "To answer your question. Because AI has no emotions or empathy. Those come wi…" (ytr_UgxyiHwZz…)
- "anyone doing 10k pull request lines is doing it wrong. Also, if you're using ai …" (ytc_UgziZ-Agj…)
- "Sorry but it's a lose lose situation, look at the man arms almost snapped just f…" (ytc_Ugxja9MFM…)
- "a 5yr old could draw a ghibli scene and it would be more unique and creative th…" (ytc_Ugz3HONNm…)
- "only Ethics will win longterm!!! We talk a lot about the risks of AI, but we do…" (ytc_UgzviCTev…)
- "These is why i don't believe AI will replace everything, mostly because it will …" (ytc_UgyWSpJ9W…)
Comment

> I think if we use droids, robots, and to some limited degree A.I. (even if i have a bias against A.I.) as tools there never needs to be a question of "rights" for a machine. If they remain completely programmed and are never actually attaining emotion or a mind of their own then chances are we'll preserve ourselves in the process.

youtube · AI Moral Status · 2017-02-23T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UghrMjkvJvyYYngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UggI3A8osDidtHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UghmvI-rbPE643gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UghqU14UzYTlX3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugj1f08yN6lvxngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_Ugi6L3X2cbXbKHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UggpTYlx4yYgFXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugj0GG7r64jiHHgCoAEC","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UggqmDrEGGZ5_3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugg0POrMdU18w3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
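The raw response is a JSON array with one object per coded comment, keyed by `id`. A minimal sketch of how such a batch response can be parsed and looked up by comment ID (field names and IDs are taken from the response above; the parsing code itself is an assumption, not the pipeline's actual implementation):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = """[
 {"id":"ytc_UggqmDrEGGZ5_3gCoAEC","responsibility":"user",
  "reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugj1f08yN6lvxngCoAEC","responsibility":"distributed",
  "reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]"""

# Index the per-comment codes by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one coded comment by its ID.
entry = codes["ytc_UggqmDrEGGZ5_3gCoAEC"]
print(entry["emotion"])  # fear
```

Indexing by `id` is what makes a lookup-by-comment-ID view possible: each coded comment maps to exactly one object carrying its four coded dimensions.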