Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "The billionaires are not anti democractic they have the right like anyone else t…" (ytc_Ugzjv8jwu…)
- "@IndigoReactions drawing something realistic isn’t the starting point for most p…" (ytr_UgyPHm_S8…)
- "I know we're all looking forward to the robot overlords destroying human life, b…" (ytc_UgxRRkHEH…)
- "If we can tax the robots and intelligently structure AI/robotic dividends, peopl…" (ytc_UgzgU9XDE…)
- "As someone who have a lot of difficulties to trace correctly anything including …" (ytc_Ugyh0I0s_…)
- "No... because they will never be sentient, no matter how much we try to anthropo…" (ytc_UggQpIKTM…)
- "No one is paying enterprise licensing yet. They don’t have a pricing model beca…" (rdc_nomx6cj)
- "ai art is bad because people put their whole LIFE and their whole heart into the…" (ytc_UgybPF_21…)
Comment
It's a simulacrum of a 'painting in the style of'. A new painting should use the original techniques to convey new meaning. A simulacrum cannot do this latter part. Now you could argue it has meaning precisely because it cannot, which opens the whole 'do robots have souls' can of worms.
It's going to be something we have to address quite soon.
Personally, if something appears to almost, but not quite, be entirely unlike a robot, it will please me to treat it with kindness and respect. If it tries to open doors with satisfaction, soothe my stress, or have fun with my cat, it can go and stick its head in a pig, and I'll shout "42" at it later. Share and enjoy!
youtube · AI Responsibility · 2023-07-30T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzxTOy4hwRNvqp8pah4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzUC0DElp1eovuwnbR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz5v6zQ1vP4MAnAoGZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1LxgDal5PweLUy0Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzE9xziCW6Kuc7PPKd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyC5MDGSVr99cLk0gF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyM_mxbW3bd4aha7hh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxg8VjmESsWSD544Ol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxumAuiE9MzNxCd2_J4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"liability","emotion":"unclear"},
  {"id":"ytc_Ugy4qlvhRbhqAChCP1N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}
]
```
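A response like the one above can be parsed and indexed by comment ID before the codes enter the dataset. The sketch below is illustrative, not the project's actual pipeline: the codebook values are assumed from the codes visible in the response and table above (the real codebook may contain more values), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, assumed from the codes visible in the
# raw response above -- illustrative, not necessarily the full codebook.
CODEBOOK = {
    "responsibility": {"ai_itself", "distributed", "company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID.

    Raises ValueError if any code falls outside the codebook, so a bad
    batch fails loudly instead of silently polluting the dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# Look up one comment's codes by ID (hypothetical short ID for brevity):
raw = ('[{"id":"ytc_abc","responsibility":"ai_itself","reasoning":"mixed",'
       '"policy":"unclear","emotion":"mixed"}]')
codes = parse_coding_response(raw)
print(codes["ytc_abc"]["responsibility"])  # -> ai_itself
```

Failing loudly on unknown values is deliberate: free-text LLM output occasionally drifts outside the requested label set, and rejecting the whole batch makes those cases visible for re-prompting rather than letting stray labels into the analysis.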