Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I feel like this and the related topics have been discussed quite extensively in SciFi media. Do Machines have emotions, do Machines want emotions, do machines want anything or are they indifferent to it all whether it's being destroyed, not destroyed, doing something or nothing. How would the interactions between a sentient machine and people look like etc etc etc.
For completion sake: substitute machine with AI, Robot or whatever.
Source: youtube
Video: AI Moral Status
Posted: 2017-02-23T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgiyjzCTc8g_oXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgiKV5roAM8drngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghulkD-qy2L3HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Uggnize15yoAyHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgjOlPQd5Ca5sHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UggcGK52nAlrHngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjJbiJBPUbWdXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi26oYgcaYTAHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiPFrZsBn3iMXgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg9RewNiCIchXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
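Each batch response is a JSON array of rows keyed by comment ID, with one value per coding dimension. A minimal sketch of parsing and validating such a batch — the allowed vocabularies below are assumed from the values observed in this batch, not an authoritative codebook:

```python
import json

# Allowed values per dimension, inferred from this batch (assumed schema).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping rows with a missing ID or out-of-vocabulary values."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # skip rows the model emitted without a comment ID
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # keep the row only if every dimension carries a known value
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Validating against a fixed vocabulary catches the common failure mode of batch coding, where the model invents a label or drops a field for one row without affecting the rest of the array.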