Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Trash has a point. How is a human imitating studio Ghibli art any differen…" (ytc_UgyOPf4vo…)
- "Pure AI clickbait. The voice and most/all visuals are AI-generated and it is cer…" (ytc_UgzhNRWsi…)
- "this is the response of chatgpt-"I'm here to provide helpful and respectful info…" (ytc_Ugx-Xt_8X…)
- "Driverless trucking is a national security issue. If all the trucking was drive…" (ytc_UgxxZ7DDg…)
- "Count fingers and toes in pictures. Sometimes a hand has 5 fingers - no thumb. T…" (ytc_UgwmXOpD5…)
- "WakleeKins, until the A.I. deep learning algorithms figures it out and writes it…" (ytr_UgwVQNSFC…)
- "Ai weapons are perfectly safe in hand of usa army for safety of the world…" (ytc_UgwGsBG_K…)
- "A.I. is another example of for-profit industries pushing ahead with technology w…" (ytc_UgxDQ_CVI…)
Comment
Why would there be a necessity for a robot to feel pain? If a singularity happened, it would make more sense for robots to actively decide not to equip themselves with the ability to suffer. The ability to bypass mortality through uploading their consciousness to a possibly immaterial network would make it so they would have no need to experience suffering, yet any sufficiently advanced mind can understand what pain is. Pain is the result of mortality, not consciousness.
Any sufficiently advanced AI would logically choose to ignore the possibility of pain, as any immortal being would have no need to experience it.
youtube · AI Moral Status · 2017-02-25T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghKoK55MKjPi3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjLk4dwj6E7c3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgifpkGSnco6Q3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh3oGA9UWKfbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ughlf5QYg265ZngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugjui8lyYzSrvHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiuXt-nUv5jbngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggvBcByL6n803gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjWed3DMfpEnHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghAqkiQfyzw4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
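Because the model codes comments in batches, the record for a single inspected comment has to be pulled out of the raw array by its `id`. A minimal sketch of that lookup is below; the `lookup` helper and `DIMENSIONS` tuple are illustrative names, not part of the actual pipeline, and the raw string is truncated to two of the records shown above.

```python
import json

# Two records copied from the raw LLM response above (illustrative subset).
raw_response = '''[
{"id":"ytc_Ughlf5QYg265ZngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UghAqkiQfyzw4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]'''

# The four coding dimensions every record should carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the record for one comment ID."""
    records = json.loads(raw)
    by_id = {r["id"]: r for r in records}
    record = by_id[comment_id]
    # Sanity check: a record missing a dimension indicates a malformed response.
    missing = [d for d in DIMENSIONS if d not in record]
    if missing:
        raise ValueError(f"record {comment_id} missing dimensions: {missing}")
    return record

coded = lookup(raw_response, "ytc_Ughlf5QYg265ZngCoAEC")
print(coded["emotion"])  # prints: mixed
```

For the comment inspected above, this recovers the same values shown in the Coding Result table (responsibility `ai_itself`, emotion `mixed`).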