Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a response by comment ID, or browse the random samples below.
Random samples (click to inspect):
| Sample comment (truncated) | Comment ID |
|---|---|
| You don't know what the automated trucks will do. When so.e of the sensor's fai… | ytr_UgwRLi69X… |
| If anything I learn even more because of AI. I skip the fillers,ads and nonsense… | ytc_UgzjjSrAl… |
| I honestly don't know for sure, but I suspect it's because of AI and the extreme… | rdc_oi1wzbx |
| Tell them to get rid of the thousands of ai slop ads the run before videos too. … | ytc_Ugzn5bpDQ… |
| If AI take over all the jobs then. Who will have money to pay for service and ot… | ytc_UgyuSmgCR… |
| I thought doomsday would be just civilians, militias, military and drones but n… | ytc_Ugx2A2U03… |
| There is no climate crisis. Now coal is making s comeback in Europe too after th… | ytc_UgwWpkEzm… |
| That makes perfect sense though, LLMs aren't able to understand what words reall… | rdc_jipw0oi |
Comment
The question is why would we create robots with the power to feel in such a way that we humans would feel our very existance was in danger? Humans and animals already exist, and thus we create rights for ourselves and other organism like animals. But sentient AI doesn't exist yet. If you don't want to answer the messy question, don't create the conditions for the problem to arise in the first place.
youtube · AI Moral Status · 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
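For orientation, the four coded dimensions above can be treated as one small record per comment. The sketch below is illustrative only: the `CodingResult` class is hypothetical, and the allowed-value sets are just the values observed in the sample response further down, so the project's real codebook may be larger.

```python
from dataclasses import dataclass

# Values observed in the sample raw response below; an assumption, not the
# project's actual schema or complete codebook.
RESPONSIBILITY = {"developer", "ai_itself", "distributed", "none", "unclear"}
REASONING = {"deontological", "consequentialist", "unclear"}
POLICY = {"regulate", "ban", "industry_self", "none", "unclear"}
EMOTION = {"fear", "outrage", "approval", "indifference", "mixed"}


@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp, as in the "Coded at" row above

    def validate(self) -> None:
        """Raise ValueError if any dimension falls outside the observed value sets."""
        checks = [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]
        for name, value, allowed in checks:
            if value not in allowed:
                raise ValueError(f"{name}={value!r} not in {sorted(allowed)}")
```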
Raw LLM Response
```json
[
{"id":"ytc_Ugj-GG9HRZn1i3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBGDS4uJuvI3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgiqMqcGlJ3kTXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggBfEyjI40hpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiq0IRMB0CCh3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh-w9BtvkjcungCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugjqh3cjpA79VXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj0_G0fNIn_JngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj2N050ddsTSHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgizsfMfud5iSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
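As the response shows, comments are coded in batches, with one JSON object per comment ID, so looking up a single comment amounts to parsing the array and matching on `id`. A minimal sketch, assuming the raw response is available as the JSON text above (the function and variable names are hypothetical):

```python
import json


def find_coding(raw_response: str, comment_id: str) -> dict | None:
    """Parse a batch coding response and return the entry for one comment ID.

    `raw_response` is the JSON array text shown above. Returns None if the
    output is not valid JSON or the requested ID is missing from the batch.
    """
    try:
        entries = json.loads(raw_response)
    except json.JSONDecodeError:
        return None  # model output was not valid JSON
    for entry in entries:
        if entry.get("id") == comment_id:
            return entry
    return None


# Example usage with one of the IDs from the batch above:
# coding = find_coding(raw_text, "ytc_Ugj-GG9HRZn1i3gCoAEC")
# -> {"id": "ytc_Ugj-GG9HRZn1i3gCoAEC", "responsibility": "none", ...}
```

Returning `None` rather than raising keeps the lookup usable even when the model skips a comment or emits malformed JSON, which is worth guarding against with raw LLM output.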