Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgxQ2h3Ar…`: "Sambucha mentioned that Animals are not Optimal for AI, is a False Flag Claim. I…"
- `ytc_Ugxwiw7WG…`: "Ai missed a lot. The part part about being the son of God. The father son and th…"
- `ytc_Ugx5oHUIZ…`: "And another reason to not ID yourself to police. Whether or not if you ‘have any…"
- `ytc_Ugy6Qbnop…`: "Considering: -Evolutionary Intelligence / Natural- / ... / Artificial- / Alienating- / a…"
- `ytc_UgzPiDD_e…`: "Time Stamp 4:20 / A Decentralized system is the only way to allow vehicles to com…"
- `ytc_UgwcxPdKM…`: "Who invested in these? I would assume each vehicle is worth 150k since it has dr…"
- `ytc_UgzblSEN4…`: "Ask him about maths and AI. It is a waste of energy to question him on topics th…"
- `ytc_UgxNMlQco…`: "AI isn’t super great at creative tasks compared to experts and the real big issu…"
Comment
Neil is full of crap when it comes to AI. But undoubtedly people who don't know what an appeal to authority fallacy is will mindlessly accept what he says and call me the idiot. In Dutch we have a saying for what Neil is doing here, it goes "Schoenmaker, blijf bij je leest" ("Cobbler, stick to your last"). Neil should take that advice to heart. The entirety of his overgeneralized and completely irrelevant rant about exponential technological growth, how people were being doomers about computers, what he personally wants a computer to do and how that supposedly relates to the threats of AI/AGI, induced the same reaction in me as this goober answering science questions at the start of the video did to him...
Source: youtube · AI Moral Status · 2025-07-28T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzDtimEa5ym9QXPfWl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwhSfpqcuO7xldq7mR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgypLbbZdjy5qOg8LVh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzL7bH3nBqLznQLh2p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxzkJWH7vnyd6mRG9x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyFTw5AjUUsL66KQEh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxUTfy1lbt1wk3WWKt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxuFW0zW4M6DsXAqox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy4bc09jDzu5Nc5OyJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzjX25EgPyXSMo0lKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
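A raw batch response like the one above still has to be parsed and checked before the codes reach the table view. Below is a minimal sketch of that step, assuming the four dimensions shown and only the value vocabularies actually observed in this response (the real tool's schema may allow more values; `validate_codes` and `ALLOWED` are hypothetical names, not part of the tool):

```python
import json

# Value vocabularies observed in the response above; treated here as an
# assumption, not the tool's authoritative schema (it may allow more values).
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only records that have an
    id and an in-vocabulary value for every coded dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # cannot attach the codes to a comment
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropping out-of-vocabulary records (rather than coercing them to "unclear") keeps model formatting errors visible instead of silently folding them into a real code.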