Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Bro watching ai made with ai talking about ai taking humans job who made ai to f…" (ytc_UgyPh0UIe…)
- "Shut up, half of yall or all of yall r running to this website to have yourself …" (ytc_UgwkEgeZE…)
- "AI is intelligence without empathy, just like a psychopath. People intend to all…" (ytc_UgwhgNfVl…)
- "In the end we Will discover that consciousness was a physical property of all ma…" (rdc_icgltrw)
- "AI is literally the modern equivalent of the nuclear bomb being developed. We ar…" (ytc_UgyzV9RkM…)
- "It's all our assumption that may be true or not , AI believes it and creates…" (ytc_Ugy-QL4XY…)
- "I hate the stupid answer of "learn a trade" when it comes to AI taking jobs... w…" (ytc_UgyU3-n3K…)
- "Oh really? 😂😂😂 Should I build a web3 platform and show you, on the MERN stack with full front end, back e…" (ytc_UgxCF9JFQ…)
Comment
We are creating robots to talk and behave like our partners of life because we want to feel their presence and talk to them after their death. There is an episode of "The Story of God" with Morgan Freeman about it. The robot's name is Bina48.
youtube · AI Moral Status · 2017-02-28T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghVriokmiBrdXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggjMob2djzkEHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggJr8-UN-xM-ngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UghhNDhzWUUiOngCoAEC","responsibility":"government","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugj9myDUs7y-zngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjfweSgo8G6r3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjXivWrKkGxu3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghzKagSWsoOAHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Uggj1y11qcrSHHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggaLH0Jy1BVU3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
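The raw response above is a JSON array with one coding object per comment in the batch, each carrying the four dimensions shown in the result table. As a minimal sketch of how such a payload might be parsed and indexed to support the by-ID lookup, assuming only the field names visible in the response (the `index_codings` helper and its validation are illustrative, not part of the tool):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw = '''
[
 {"id":"ytc_UghVriokmiBrdXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UghhNDhzWUUiOngCoAEC","responsibility":"government","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
'''

# The four coding dimensions, as they appear in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict:
    """Parse the model's JSON array and key each coding by comment id."""
    indexed = {}
    for row in json.loads(payload):
        # Reject rows missing the id or any coding dimension.
        if "id" not in row or any(d not in row for d in DIMENSIONS):
            raise ValueError(f"malformed coding row: {row!r}")
        indexed[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return indexed

by_id = index_codings(raw)
print(by_id["ytc_UghhNDhzWUUiOngCoAEC"]["policy"])  # prints: ban
```

Indexing by `id` is what makes the "Look up by comment ID" box cheap: one parse of the batch, then constant-time retrieval per comment.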