Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Is AI will take jobs of fashion designers???pls tell me if any one have any thou…" (ytc_UgwrYU7gE…)
- "Lucky for you Stable Diffusion is both free and open source. So make your own ve…" (ytr_UgzmiRGIN…)
- "Finally Google does something to protect us from AI... this cheese burger would …" (ytc_UgwvN4kJE…)
- "no. the ai didnt know about the cryptic message of "coming home" for suicide. an…" (ytc_UgykVLsTC…)
- "Shit I want an art job so bad idk what I would do if it’s all ai. Maybe I’ll jus…" (ytc_UgyqrCF5L…)
- "Im german law (Art 5 I GG) there are three theories for definitions of art: 1.…" (ytc_UgwKQbA8j…)
- "Suckerberg, Altman, and Musk all speak with the same ticks, believing they look …" (ytc_UgwJbwuQC…)
- "First you have to understand the ocean level isn't the same everywhere. There is…" (rdc_d2yym9i)
Comment
Considering how we can only truly know our own consciousness and infer it in others, is there any meaningful ethical difference between pretending at consciousness and consciousness itself? If we can't tell the difference, shouldn't we treat AI with the same respect (or lack thereof) that we treat fellow humans? How we treat AI that acts human tells us a lot about how we can dehumanize people who are already here.
youtube · AI Moral Status · 2023-09-04T14:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwuvPMvW1Yd5WAkGBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3gnCoP-xn7-S94MF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCEPREMTYhaJWhC5J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwk6wTlq55JmN6yt8t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxn1WdFZU03E9Dt6NN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwBaL9lMw5ttOIWXfJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz2p_H50QBytY42knB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwq61-e9i9c3mZ6Hy54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_tLfNlBoYL7u6GuB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzshgq8A2AcZcL7Ftt4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
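A minimal sketch of how a raw response like the one above could be parsed and indexed for per-comment lookup. The record shape (`id` plus the four coding dimensions) is taken from the response itself; the function name, the fallback to `"unclear"` for missing fields, and the shortened sample payload are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codes,
# using two records copied from the output above.
raw_response = """
[
  {"id": "ytc_UgyCEPREMTYhaJWhC5J4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzshgq8A2AcZcL7Ftt4AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw response and index each coding record by comment ID."""
    codes = {}
    for rec in json.loads(raw):
        if "id" not in rec:  # skip malformed records defensively
            continue
        # Keep only the expected dimensions; default missing ones to "unclear".
        codes[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return codes

codes = index_codes(raw_response)
print(codes["ytc_UgyCEPREMTYhaJWhC5J4AaABAg"]["emotion"])  # approval
```

With an index like this, the "look up by comment ID" view only needs a single dictionary access per coded comment.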