Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
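The ID lookup can be sketched as a dictionary keyed on the comment ID. This is a minimal illustration; the record shape follows the raw LLM response shown at the bottom of this page, and the IDs here are shortened placeholders, not real ones.

```python
# Minimal lookup sketch, assuming coded records shaped like the raw LLM
# response on this page. IDs are hypothetical placeholders.
records = [
    {"id": "ytc_example1", "responsibility": "none", "emotion": "fear"},
    {"id": "ytc_example2", "responsibility": "developer", "emotion": "outrage"},
]

# Index once, then fetch any coded comment by its ID in O(1).
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_example2"]["emotion"])  # → outrage
```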
Random samples — click to inspect
- "I think the Internet of Things is the ultimate way to self-driving because if ev…" (ytc_UgyCoxHpv…)
- "I think the main problem is that it doesnt work consistently. The more complex a…" (ytr_UgwUjm9hT…)
- "There's so many things said here that are just a complete misunderstanding of AI…" (ytc_UgzVIinh1…)
- "I also am loving the unbridled proliferation of chatgpt in the comments section.…" (ytc_Ugwr4cyKn…)
- "I still remember the first time I didn’t doodle to doodle, but wanted to draw a …" (ytc_Ugx1pygQq…)
- "Was looking for this. These jokers don’t even try with AI! 😂 AI is the red herri…" (ytr_Ugzbrc6Jb…)
- "One or 2 mistakes should automatically be banned not a good responsible decision…" (ytc_Ugx4QH8ez…)
- "We hope you enjoyed the video! Remember, on the AITube channel for subscribers, …" (ytr_UgzxCljkA…)
Comment
Am I the only AI professional that gets very annoyed by this guy? The jump from minute 46 and 49 is immense. It assumes AGI from the way he is talking about it. AI today doesn't think. An LLM is something where you give it an input, and it gives you an output. It's very advanced compression. The jumps he makes talking about self preservation is entertainment value and assumes technology that doesn't exist.
youtube · AI Moral Status · 2026-03-02T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw_aEXTFogAnQ2YMMd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytV1pB9MINc2dSpMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxirK7zMYMdyUSLAzV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzmTc702KrCMa97eUl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw0R-e1dSRDU2umLYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxpvyvIn7j1qgSg9Lx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYnZAcijKqJ6uVF6t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvGmQ29xS0swi0S2B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzlJloebKr_q-5LDah4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx8t3JtLkyvFanpHgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
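A raw response like the array above can be checked before the coded values are stored. The sketch below parses the JSON and splits records into valid and invalid rows; the allowed values per dimension are inferred from the samples on this page and may not match the full codebook.

```python
import json

# Allowed values per coding dimension, inferred from the sample LLM output
# above; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"resignation", "indifference", "approval", "fear", "mixed", "outrage"},
}

def validate(raw: str):
    """Parse a raw LLM response and split rows into (valid, invalid)."""
    valid, invalid = [], []
    for rec in json.loads(raw):
        # A row is invalid if any dimension is missing or outside ALLOWED.
        bad_dims = [d for d, ok in ALLOWED.items() if rec.get(d) not in ok]
        (invalid if bad_dims else valid).append(rec)
    return valid, invalid

# Hypothetical two-row response: the second row uses an unknown emotion label.
raw = '''[
  {"id": "ytc_example1", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_example2", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "happy"}
]'''

valid, invalid = validate(raw)
print(len(valid), len(invalid))  # → 1 1
```

A row that fails validation can then be flagged for recoding rather than silently written to the results table.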