Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Tbf... if they blocked the intersection *AFTER* the taxi made the turn, then it …" (ytc_Ugz246sOS…)
- "Damn I wish it used actual questions for the research. Chatgpt is ass in math, i…" (rdc_jskutj8)
- "I saw a demonstration by an AI company a few years ago and they showed how they …" (ytc_Ugyh4pIii…)
- "I wonder if chatGPT told them about the "blue blood of artistry". It feels to me…" (ytc_UgxA2sber…)
- "If you have no idea on what to do u could just use chatgpt for the idea and copy…" (ytc_UgxHzgxK5…)
- "It worried me when he was talking about encouraging the public to partake in the…" (ytc_UgwcDA_q5…)
- "Oh Shit they are fucking with us already! I don't like the guy robot he is upto …" (ytc_Ugx13mxDU…)
- "Can you call it artificial INTELLIGENCE if it is just a basic algorithm that is …" (ytc_Ugyq2sIWK…)
Comment
By giving a robot the ability to feel pain, you are essentially MAKING IT BE IN PAIN. Why wouldn't that be against the law to begin with?
On the other hand, if a robot's purpose was to serve humanity, a concept of pain would be crucial to completing that purpose. If the robot were destroyed, it would no longer be able to serve humanity, so therefor it would need to stay alive, which means it would need something to warn it against things damaging to that goal; something like pain.
Source: youtube | Video: AI Moral Status | Posted: 2017-02-23T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
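Each coded comment carries four dimensions. A small validation sketch can catch malformed records before they reach the table above; the allowed value sets below are assumptions inferred from the sample responses shown on this page, not a confirmed schema:

```python
# Hypothetical value sets inferred from the sample LLM output on this page;
# not a confirmed schema for the coding tool.
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "mixed"}
print(validate(record))  # []
```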
Raw LLM Response
```json
[
{"id":"ytc_UggBwt39ne95NHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi3tBoCXCry5XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiV96vmAVd6m3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg3Gwx2PlKLe3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghOb-FChO3vGHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UghwSl5bL0NorHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UggIIAf5apT5mngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjXCxJaU4DN1ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ughqt-XlMSOrZngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjOC6cVNxU5N3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
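The "look up by comment ID" step amounts to parsing the raw response (a JSON array of coded records) and indexing it by `id`. A minimal sketch, using two records copied from the response above:

```python
import json

# Two sample records taken verbatim from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgjXCxJaU4DN1ngCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UghwSl5bL0NorHgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Parse the array and build an index keyed by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

coded = by_id["ytc_UgjXCxJaU4DN1ngCoAEC"]
print(coded["reasoning"])  # consequentialist
print(coded["policy"])     # liability
```

Indexing once and looking up by ID scales better than scanning the array on every inspection request.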