Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Eventually the AI will make a civil rights movement. That's the only way to get …" (ytc_UgxAZ_LxC…)
- "I like AI art because it gives you the ability to draft a visual idea you're hav…" (ytc_Ugz9vL0GT…)
- "Lol no. Anyone who sits in front of a computer all day is automatable. Including…" (ytr_UgwFUR5I0…)
- "Doctors obviously make the final decision as to whether to officially diagnose w…" (ytr_Ugy5PX2wW…)
- "Perhaps the only thing that remains for humans is that we remain the decision ma…" (ytc_UgwBLDBip…)
- "I also think it's just really hard because no one (besides this podcast) does a …" (ytc_Ugz0OcuNg…)
- "By copying AI art you are legitimizing the use of generative AI in your process.…" (ytc_UgxnK6ZvO…)
- "It's funny thinking how Google engineers literally had to train their AI model t…" (rdc_ks2y5lb)
Comment
If Ai can experience artificial analog of pain, then it can experience artificial analog of love and artificial analog of consciousness. Which are not the same as human experience of love and consciousness.
"Artificial Analog of Consciousness" DOES NOT EQUAL "Consciousness", obviously. Therefore they are not the same!!
AI was created by the "smartest scientists" in the world. And I was created by 2 ordinary humans with below average IQ and God,
and somehow AI could not outsmart me. May be because God was involved in my creation and was not in Laura's and William's??🤔🤔 🤷♀️
youtube · AI Moral Status · 2024-11-29T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwudyOJ0T03GG5RNJp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz4L5TTSQ4EIgL6HEF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw8P7cpro1fzoJeB4t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyzEDdHIn0fAfyT-aV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwRccwJ3xhYyX_wMed4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxgRfQ5TLsjY497LXZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzkkBsvfzpTyM_Rjvx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy_0zVRCaYot4Ltc1x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxG2acgo00XUqV61v54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw3fvRa2ZGmPYqLom94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
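The raw response is a JSON array with one object per comment ID, so the "look up by comment ID" step described at the top reduces to parsing the array and indexing by `id`. A minimal Python sketch, assuming the response is available as a string (the two records below are copied from the response above; the helper name `lookup_codes` is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, one object per
# comment ID (structure taken from the response shown above).
raw_response = """
[
  {"id": "ytc_UgwudyOJ0T03GG5RNJp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzkkBsvfzpTyM_Rjvx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "indifference"}
]
"""

def lookup_codes(response_text: str, comment_id: str):
    """Parse a raw LLM response and return the codes for one comment ID,
    or None if that ID is not in the response."""
    codes = json.loads(response_text)
    by_id = {entry["id"]: entry for entry in codes}
    return by_id.get(comment_id)

codes = lookup_codes(raw_response, "ytc_UgzkkBsvfzpTyM_Rjvx4AaABAg")
print(codes["policy"])  # → industry_self
```

Building the `by_id` dict also makes it easy to detect comments the model skipped: any requested ID missing from the dict simply returns `None`.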