Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- So if u are a driver in a self driving truck and the self driving truck causes a… (ytc_UgwwIqCTp…)
- Not gonna lie. I liked the original better than any of the others. AI for the w… (ytc_Ugw73BM9D…)
- @thomgizziz Tim was literally defending OpenAI and their corrupt, criminal pract… (ytr_UgxDe_hvz…)
- AI does not stop you from creating art. AI limits the money you make with it. Ma… (ytr_UgwQc71h2…)
- Just a thought. If people are let down by the long terms effects of A.I then wha… (ytc_Ugz-RzSyI…)
- This is me hearing the truth from a software content developer who knows the tru… (ytc_UgzJCb7wY…)
- > "AI is not true intelligence — it's a reflection of human thought, built on ou… (ytc_UgxLRD2xc…)
- “It knows more than you do.” No it absorbs, combs through and produces data poin… (ytc_UgxJ27BfL…)
Comment
Does ai think when it doesn't have a question or problem it's trying to answer? Have they developed a virtual world for the AI to spend time in?
youtube · AI Moral Status · 2026-03-01T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgyTZLxAjX1JOqSFKDN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx7giDTBzm2AYgniCp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzxFPtalflIaRL05154AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyuJMienFlrXjaU8nl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyM1ZcmRyj_5pdZ1wN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgypIAMe5PrSMvl72uR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugys-eq6oFVODIyHltB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy86lMQFFGzPrqH6FN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw-RBqYdE27O3S0q1B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMVjuIsaEumzNd4s14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]
```
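A batch response like the one above can be consumed with a few lines of Python. This is a minimal sketch, not the pipeline's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, while the `parse_batch` and `tally` helpers and the malformed-row handling are assumptions for illustration.

```python
import json
from collections import Counter

# Two example rows taken verbatim from the batch above.
RAW = (
    '[{"id":"ytc_UgyTZLxAjX1JOqSFKDN4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"approval"},'
    '{"id":"ytc_Ugy86lMQFFGzPrqH6FN4AaABAg","responsibility":"developer",'
    '"reasoning":"deontological","policy":"unclear","emotion":"mixed"}]'
)

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows with all keys.

    Hypothetical helper: drops malformed rows rather than raising,
    which is one reasonable policy for noisy LLM output.
    """
    rows = json.loads(raw)
    return [r for r in rows if REQUIRED_KEYS <= r.keys()]

def tally(rows: list[dict], dimension: str) -> Counter:
    """Count how often each label appears for one coding dimension."""
    return Counter(r[dimension] for r in rows)

rows = parse_batch(RAW)
print(tally(rows, "responsibility"))  # one 'none', one 'developer'
print(tally(rows, "emotion"))         # one 'approval', one 'mixed'
```

Tallying per dimension like this is how the label distributions behind a table such as "Coding Result" could be aggregated across many coded comments.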