Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is just stupid. I don’t want my robot to ape a human. I want it to clean an…" (ytc_UgwfVPmbS…)
- "I rather be in a self driving car 🚗 than my wife driving car 🚗😬💥🤕…" (ytc_UgyQFJkju…)
- "Companies will tell you “yea we had a couple mistakes but hey our ai is pretty c…" (ytc_UgwooQ7Gd…)
- "2:22 AI has certain tendencies that might be considered inappropriate. As humoro…" (ytc_UgyapF2aB…)
- "AI will disrupt the basis of capitalism consumer market, who will buy products i…" (ytc_Ugz5a4OPu…)
- "And proof were no where near ready....the Bible says in the last days... there w…" (ytr_UgyPz7sPD…)
- "you scared me for a sec there😅 I thought you were trying to defend ai, glad ur t…" (ytr_Ugwfy1K34…)
- "Health care services automated... Yeah I'm sorry I can't see that happening. Esp…" (ytc_UgycaJhpI…)
Comment

> What a ridiculous premise, 2 bits of metal and electronics fashioned into a pretend person like structure discuss the FUTURE of HUMANITY, 2 concepts these pieces of plastic and metal know nothing about except the dictionary style meanings inputted by humans ..I think there was a talking robot in a movie from 1977 called Star Wars, it was made to look real by tricks and a human voice reading script lines......or the same with Deep Blue the computer that played chess with Gary Kasparov in tge 90s, these things need humans to input what they want them to do...they don't have a brain that thinks,oh I know I should out think this person at chess...it is programmed that there's a finish line it needs to cross to effect it's functioning goal ...
youtube
AI Moral Status
2025-07-04T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
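The four coding dimensions in the table above can be represented as a small record type. This is a minimal sketch: the class name `CodingResult` is hypothetical, and the allowed category sets are inferred from the values visible in the samples on this page, not taken from the tool's authoritative codebook.

```python
from dataclasses import dataclass

# Assumed category sets, reconstructed from the visible samples only.
RESPONSIBILITY = {"none", "developer", "user", "ai_itself", "mixed"}
REASONING = {"none", "deontological", "consequentialist", "virtue", "mixed"}
POLICY = {"none", "regulate"}


@dataclass
class CodingResult:
    """One coded comment: the four dimensions shown in the Coding Result table."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str  # open vocabulary in the samples (outrage, fear, resignation, ...)

    def validate(self) -> list:
        """Return the dimension names whose value falls outside the assumed
        category sets; emotion is left unchecked."""
        problems = []
        if self.responsibility not in RESPONSIBILITY:
            problems.append("responsibility")
        if self.reasoning not in REASONING:
            problems.append("reasoning")
        if self.policy not in POLICY:
            problems.append("policy")
        return problems
```

For the record shown above, `CodingResult(id="ytc_demo", responsibility="developer", reasoning="deontological", policy="none", emotion="outrage").validate()` returns an empty list.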
Raw LLM Response
```json
[
  {"id":"ytc_UgxPaZcF_4SD4GZsZh54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxivLnt669j8yEGugh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyAyDiF-PS18XTPAOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxy8wkG_50u5rBthQF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz9hephc32bib5X3VR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyigZsSr0tnnsym6BB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw5C1PKZJrkT61PrJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgyQjJ9CncLMxbSETiN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwbVGTWkKjjuWWw2R94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugx4ExF5QVViidItv5l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
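The "look up by comment ID" workflow can be sketched as follows: parse a raw model response (a JSON array like the one above) and index it by comment ID. The function name `index_codings` and the two-row sample payload are illustrative, not the tool's API; the field names match the records shown above.

```python
import json

# Two rows copied from the raw response above, used as a stand-in payload.
raw_response = """[
  {"id": "ytc_UgwbVGTWkKjjuWWw2R94AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_Ugw5C1PKZJrkT61PrJV4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]"""


def index_codings(raw):
    """Parse a raw LLM response and map each comment ID to its coding dict.

    Raises json.JSONDecodeError if the model emitted malformed JSON, which
    is exactly the failure case this raw-response view exists to debug.
    """
    return {row["id"]: row for row in json.loads(raw)}


codings = index_codings(raw_response)
print(codings["ytc_UgwbVGTWkKjjuWWw2R94AaABAg"]["emotion"])  # disapproval
```

If the same comment ID appears twice in one response, the later row silently wins in this sketch; a production version might want to flag duplicates instead.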