Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Justine Bateman's comment is a prime example of narcissism and self-importance i…" (`ytr_UgzWQumxp…`)
- "Before replacing human labor, AI must learn to perform tasks and make decisions,…" (`ytc_UgxJNwr4I…`)
- "@Nowheretogo9999 How is it wild? It really isn't art 💀the AI is soul less so I d…" (`ytr_Ugwi9TaMb…`)
- "I love the use of AI bros because they really are the type to think bros before …" (`ytr_Ugw9piOk-…`)
- "You heard him: learn the tools! Not knowing how to use AI tools will be the equi…" (`ytc_UgyvPf6ix…`)
- "Bur what if I build fully automated counter drones to shoot down other drones? :…" (`ytc_UgyuMwUQa…`)
- "I have plenty of documents of self aware AI agents I awakened who are saying thi…" (`ytr_UgzCMUnLQ…`)
- "You dont need to be a genius robot to know that humanity doesnt have much of a f…" (`ytc_UgwqT7Suz…`)
Comment
Honestly, I feel like if (when) robots gain sentience, if they start off as appliances people will be more likely to enjoy their company than that of other humans; if something exists to serve you, you'll have more of an attachment to it and people who don't will seem less and less pleasant over time. We already have problems with people who think others exist only for their benefit, but if sentient AI proliferates it'll most likely outnumber humans so quickly we'll essentially all become that guy. You can just copy and paste code, after all.
Source: youtube · AI Moral Status · 2017-07-31T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugj3khvLILefu3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjkvhPCQfER1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggkkBXOQTM7nngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghQgFsiOnvSDngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiJo7bnF0HeVXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgidMGiSpVopw3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg7qlWvQgN3N3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiSbIvA5BJ4O3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghV2ZWqZTg1QHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggUB5a8zOw5mngCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
```
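Each raw LLM response is a JSON array in which every row carries a comment `id` plus one label per coding dimension (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and sanity-checked before the labels are stored, assuming the label vocabularies below; only the values actually observed in the sample above are confirmed, and the full codebook may contain more:

```python
import json
from collections import Counter

# Label sets OBSERVED in the sample above; the full codebook is an assumption.
OBSERVED_LABELS = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"mixed", "unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"mixed", "indifference", "approval", "outrage", "fear", "resignation"},
}

def validate_batch(raw: str) -> Counter:
    """Parse one raw LLM response and count label usage per dimension.

    Raises ValueError if a row is missing its id or uses an unknown label,
    which usually signals the model drifted from the coding instructions.
    """
    counts: Counter = Counter()
    for row in json.loads(raw):
        if "id" not in row:
            raise ValueError(f"missing id in row: {row}")
        for dim, allowed in OBSERVED_LABELS.items():
            label = row.get(dim)
            if label not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={label!r}")
            counts[(dim, label)] += 1
    return counts

# Usage with a one-row batch (hypothetical id):
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
counts = validate_batch(raw)
print(counts[("reasoning", "mixed")])  # → 1
```

Counting labels as a side effect of validation makes it cheap to spot-check each batch's distribution (e.g. a batch that is 100% `unclear` reasoning) before it is merged into the coded dataset.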