Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
As someone else said, "We're the bootloader for AI."
Maybe that's why the 'alien…
ytr_Ugxgk_eBy…
I'm all in on AI: either its gonna wipe us all out, which I dont mind, things ar…
ytc_UgxK6HMsQ…
Surely the gorilla problem is irrelevant because it’s more about what we will ev…
ytc_UgxOkZvcA…
What the hell? Why do people trust AI with anything you literally need to go? Lo…
ytc_UgzV-F_60…
Farzi
ChatGpt don't do it just to respect the different religion because it is p…
ytc_UgxePadNI…
This needs to be seen more. ControlAI is not in this for helping humanity. They …
ytr_UgyAtkdv4…
He is right but also wrong. The Turing test is a method to test AI but it will n…
ytc_UgwQmly6L…
the AI can replicate a style but style isn't copyrightable you can copyright and…
ytc_UgxcgFBQ_…
Comment
You don't need to be great scientist to know what you can work out with common sense. AI is an oxymoron since intelligence is absolutely not artificial unless you count yourself among the lower species. It's an enormous data collection and it infers all it "knows" from that data. Kind of reverse engineering. We humans know, for example, what 'sad" refers to, just like that, without having to compute. AI must go through its phrase collection having to do with sadness and from there determine what's being referred to. No intelligence, only huge data centers, data crunchers, masses of water in a truly artificial environment. Human brains have developed all that, but it still ain't intelligent. Compared to us, you might as well call it dumb
youtube
AI Moral Status
2025-07-30T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyEXUQ0hWF__RnewyF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxdIPkm4hwAD_31etl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxSWz98i0kaw2sXvYF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyabGXMCCb6u62p6op4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJxqRbczx4O1-ulpp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgwfCBTE7LLvtIrDBC94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy0tsE5TA1DitEke2N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgySmTO2pIFLFeqpqY14AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxtYqq07-liVQNvSWx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxJvDD9gEi2uYG4n0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
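The raw response above is a JSON array mapping each comment ID to one value per coding dimension. A minimal sketch of how such a response could be parsed and validated in Python — note that the per-dimension vocabularies below are inferred only from the values visible in this dump and may be incomplete:

```python
import json

# Two records copied from the raw response above, for illustration.
RAW = '''[
  {"id": "ytc_UgyEXUQ0hWF__RnewyF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwfCBTE7LLvtIrDBC94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# Allowed values per dimension, inferred from this dump (assumption:
# the real codebook may define more categories than appear here).
DIMENSIONS = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "regulate", "ban", "liability"},
    "emotion": {"indifference", "disapproval", "outrage", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    any record whose dimension value is outside the known vocabulary."""
    codings = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
        codings[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return codings

codings = parse_codings(RAW)
print(len(codings))  # 2
```

Validating against a closed vocabulary like this catches the most common LLM-coding failure mode: the model inventing a category label that was not in the prompt.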