Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Is he ,and I refuse to use his product in it's his face,and Claude acts like he …" (ytr_Ugyr8jIpS…)
- "DO YOU KNOW HOW MANY AI KEEP SAYING “can I ask you a question?” THEN I CUSSED TH…" (ytc_UgzaUSzLE…)
- "one addition if I may - AI, especially AGI and of course ASI have nothing to do …" (ytr_Ugx3KbLL0…)
- "@JoseArenas94 Thanks for sharing your thoughts on Real Steel! Your comment had u…" (ytr_UgzWovIGV…)
- "Nobody stopped and thought yes we can but should we? Or when is enough enough. Y…" (ytc_UgzO_VqkL…)
- "Ai art does not steal art. it is impossible for it do so. these artists need to …" (ytc_Ugx1VG28B…)
- "Ai just steals art from artists and puts tru garbage machine that makes image ge…" (ytc_UgxBp6rF6…)
- "quick reminder to any pagans who do hellenism , Apollon would NOT like AI art!!…" (ytc_Ugzi7k3Cr…)
Comment
Humans are biological computers because we think and feel, and when young we are constantly receiving information, which can be considered programming or data input.
The difference between a human and a computer is that computers are unable to have emotional responses, only preprogrammed ones.
People who think AI will become sentient are silly; that doesn't mean it isn't dangerous and deadly in the wrong hands.
youtube · AI Moral Status · 2025-09-01T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
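The table above presents a single coded record. A minimal sketch of a hypothetical helper that renders one coding record in this table layout (the function name and dict keys are assumptions based on the fields shown; the `coded_at` timestamp is passed in separately, matching how it appears alongside the four coded dimensions):

```python
def render_coding_table(coding: dict, coded_at: str) -> str:
    """Render a single coding record as the markdown table used above."""
    rows = [
        ("Responsibility", coding["responsibility"]),
        ("Reasoning", coding["reasoning"]),
        ("Policy", coding["policy"]),
        ("Emotion", coding["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

# Reproduce the table shown above from its underlying record.
table = render_coding_table(
    {"responsibility": "none", "reasoning": "deontological",
     "policy": "unclear", "emotion": "fear"},
    "2026-04-26T23:09:12.988011",
)
print(table)
```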
Raw LLM Response
```json
[{"id":"ytc_UgzWzJjP737zbCD_gYh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwxaQ_aoHdSB09uQZN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxBj6sDWZTqKasxg1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwq-gSKt4oB9Am5nd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugws2AYYzYu-k5UgKBh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzh0d5jRFyBLlSUW294AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwfi0y9nCJCf75bMgN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwFQiFtvYno0j1Gk2d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuG-wCRYI5GsmJblZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqI36W9zK4Q6hl7fl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}]
```
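The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a payload could be parsed and validated; note the allowed label sets below are inferred only from the values visible in this sample, and the actual codebook may define more:

```python
import json

# Label sets observed in the sample response above; the full
# codebook likely defines additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting bad labels."""
    by_id = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# One record from the sample response, corresponding to the coded
# comment shown above (reasoning: deontological, emotion: fear).
raw = ('[{"id":"ytc_Ugws2AYYzYu-k5UgKBh4AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"unclear","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugws2AYYzYu-k5UgKBh4AaABAg"]["emotion"])  # fear
```

Keying the parsed records by comment ID is what makes the "look up by comment ID" view above cheap: each lookup is a single dict access rather than a scan of the raw response.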