Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
I know, right? I mean, you should've heard the absolute _bile_ I had to put up w…
ytr_Ugx2i3aEC…
As an artist, Ai has had an insane affect on the art world. People also like to …
ytr_UgwAltgjZ…
I've been laughed at for being polite to ChatGPT but I know who will have the la…
ytc_UgxcVgT8N…
I just got an AI generated ad for Chat-GPT midway through this. This is crazy…
ytc_Ugz52uVgg…
STOP relfecting your primitive and hostile thoughts on AI, AI is not as dumb as …
ytc_UgxF8hp9G…
AI just following the statistical logical evidence rather than the feels based a…
ytc_UgwtjX4ke…
i think the PC comparison is useful but there is a key difference that gets over…
rdc_oi2foic
What will happen to all the workers displaced by AI? It's already a pain to find…
ytc_UgzRU9wbE…
Comment
How can you program something to feel pain? In the end it's all just bits running around.
You CAN however simulate pain, I mean, you can program a robot to express it feels pain, and program it to do the things that give pleasure and avoid the painful ones. But it would be fundamentally equivalent to programing a Java program that displays a happy face when we give it a virtual apple, and display sad face when we give it poison. But it doesn't effectively feel anything - well, I guess.
youtube
AI Moral Status
2017-02-24T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghyXzu2XC_913gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgivGeenbgAVsHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj_4LAWchwUNHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjOZFi2KQgtF3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggnIwBEucuEIngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Uggf7zVJ7GJbHHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiC4plFAWxImHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_UggzbpDGUt7ibHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggFuDC5x01ktHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgjFdWWtlSXv_XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
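The raw response above is a JSON array, one record per coded comment, with four coding dimensions. A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed-value sets are only the values observed in this sample (the pipeline's full taxonomy may include others), and the function name is hypothetical.

```python
import json

# Dimension values observed in this sample's raw response.
# ASSUMPTION: the actual coding taxonomy may contain additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "distributed", "developer"},
    "reasoning": {"virtue", "unclear", "deontological",
                  "consequentialist", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "mixed", "resignation"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag, rather than reject,
    records whose dimension values fall outside the observed sets."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                # Collect the names of out-of-vocabulary dimensions.
                rec.setdefault("_warnings", []).append(dim)
    return records
```

For example, a record matching the "Coding Result" table above parses cleanly, while an unexpected value only attaches a warning:

```python
raw = ('[{"id":"ytc_UgjOZFi2KQgtF3gCoAEC","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
records = parse_coding_response(raw)
print(records[0]["reasoning"])  # deontological
```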