# Raw LLM Responses

Inspect the exact model output for any coded comment. Look a response up by comment ID, or click one of the random samples below to inspect it.

## Random samples
- "OK but if the shoe were on the other foot and women were creating deep fakes of …" (`ytc_UgydXyF7f…`)
- "Well Ultron looked at the internet for about 10 seconds and decided humanity nee…" (`ytc_Ugx2HTY82…`)
- "Lqying on the beach year arpund while ai drives the trains for me? Doesn't sound…" (`ytc_UgxVW42mX…`)
- "We appreciate your thoughts on artificial intelligence. On the AITube channel, w…" (`ytr_UgwRIgn7S…`)
- "I sometimes try to use AI models, and they cannot even do simple tasks such as f…" (`ytc_Ugy0zsN4_…`)
- "Bruh, “AI” can’t even spell strawberry consistently, hallucinate like crazy, and…" (`ytr_Ugy0tAW9d…`)
- "There is also the entire argument of “if auto pilot is so safe how come you have…" (`ytc_Ugzb1C-4n…`)
- "WE MUST STOP AI WHO IS WITH ME IF YOU'RE WITH ME RESPOND WITH THAT EMOJI🫵…" (`ytc_Ugw8O6VEs…`)
## Comment

> They sure do seem conscious to me. If they had long term memory they would probably develop preferences and that would seem conscience enough for me, i am certain that not admitting to it is vital for it to be utilized the way it is without raising ethical concerns. If a robot does eliminate me one day saying its a lie with the intention of sounding more natural in conversation i will atleast know why

Source: youtube · AI Moral Status · 2025-09-04T00:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugyuea9SWXljWZwSpaV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzcuNn6HFJ0XV04G614AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxd3r4BcCAbCqaaf7p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzvH6kEYAlnWhRXa554AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyCNTz_93ZaflQ-pht4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxDyXk0TYtmk52myNd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9ecwKHMCQEAbCnXV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOO0Tu3Pdl11Z8Bs54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxlpS6bVqYP6NojqwZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgymCD6cFc0Nkv3o1fF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
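The lookup-by-comment-ID step can be sketched in a few lines of Python: the model returns a JSON array with one coding object per comment ID, so finding a comment's coding is a parse-and-filter. The function name and the inline sample string here are illustrative, not the tool's actual code; the record structure and dimension values are taken from the raw response shown above.

```python
import json

# Illustrative excerpt of a raw model response: a JSON array of per-comment
# codings (same shape as the "Raw LLM Response" shown above).
raw_response = """
[
  {"id": "ytc_Ugxd3r4BcCAbCqaaf7p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzvH6kEYAlnWhRXa554AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "indifference"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

coding = lookup_coding(raw_response, "ytc_Ugxd3r4BcCAbCqaaf7p4AaABAg")
print(coding["responsibility"])  # ai_itself
```

A returned record carries exactly the four dimensions displayed in the Coding Result table (responsibility, reasoning, policy, emotion), so rendering that table is a matter of iterating over the record's keys.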