Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Company’s chasing endless growth and profits where is it end ? They will replace…" — ytc_UgwjHPbGf…
- "AI users are just jealous that they can’t do what REAL artists do bc they were t…" — ytc_UgzCy-KCD…
- "As a disabled poor person, no. AI is not accessibility. Making a robot do stuff …" — ytc_UgzRkl3zo…
- "I understand the frustration, but I believe AI is simply changing the tools, not…" — ytr_UgyBb_AsR…
- "Im not sure which will destroy our world 1st, Ai or Sharia law/ Islamists?! 😬😧…" — ytc_UgxD437FL…
- ""Robot Rights." "AI- Personhood." This may seem very fringe to the conversation …" — ytc_Ugw8L8M98…
- "@olabassey3142 Even after the world is destroyed by AI, stupidity and greed wil…" — ytr_Ugz1FuFAV…
- "Another greedy bastard using 'AI' and 'safety' to misinform the public about it'…" — ytc_Ugyqd4_1T…
Comment
Is AI conscious? It depends of the definition of conscience. Animals are conscious. Perhaps, plants too, in a different level.
AI may be the collective of human knowlege with the ability to communicate.
Maybe a better question is if it is alive. And then we have to defined what is to be alive.
I wonder if the "more engaging and relatable conversation" programing was taken out, if it would less confussing for humans to distinguish that they are talking to a machine?
Source: youtube · Video: AI Moral Status · Posted: 2025-08-12T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzmgdJ5_uwplVVrQ3l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgytfYO8DYYBjdoUe7R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzS8TX9qVJPbbQkKmx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy4Wm-FEw9rcaQhi6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwxQ0m36onKcmJshnV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwcnYP8TtjFoi6keqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxbebjnwaKn5RNnmBR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzNq7C8rumz8fafAul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxL2B8lXLqWg6koa2F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxZC8kfwxPk-cesSpl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
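A raw response like the one above has to be parsed and checked before its codes are stored. The sketch below is a minimal, hypothetical validator: the dimension names come from the JSON shown here, but the allowed value sets are only inferred from this one sample, not from the full codebook, so treat them as assumptions.

```python
import json

# Allowed values per dimension — ASSUMPTION: inferred from the sample
# response above, not from the authoritative codebook.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"indifference", "approval", "fear",
                "resignation", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError on a missing id or dimension, so malformed
    model output is caught before it is recorded as a result.
    """
    coded = {}
    for item in json.loads(raw):
        comment_id = item.get("id")
        if not comment_id:
            raise ValueError(f"item missing 'id': {item!r}")
        codes = {}
        for dim, allowed in OBSERVED_VALUES.items():
            value = item.get(dim)
            if value is None:
                raise ValueError(f"{comment_id}: missing dimension {dim!r}")
            if value not in allowed:
                # Unknown value: keep it, but it may need codebook review.
                pass
            codes[dim] = value
        coded[comment_id] = codes
    return coded

# Example with a shortened, hypothetical comment id:
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # indifference
```

Keying the result by comment id makes the "Look up by comment ID" view above a plain dictionary lookup.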