# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.

## Random samples
- @nickmagrick7702 "A person who is very knowledgeable about or skilful in a parti… (`ytr_Ugz7oZeFH…`)
- There are some wild takes here. The "lazy designer" is someone who WON'T use AI… (`ytc_UgxeiCsV7…`)
- Healthcare is the only field AI can’t claim yet. We still need live nurses and d… (`ytc_Ugy8NIB9Z…`)
- Not really. It cant install access points and a CEO cant just ask ChatGPT to mak… (`ytr_UgxMcNK6i…`)
- Spot on. The question isn’t really if AI can make art, it’s why it would.… (`ytc_UgwzeYZ_F…`)
- I don't necessarily think ai itself is bad, its basically a dumb child, it doesn… (`ytc_UgxSXdeWD…`)
- the only thing making us go instinct is how stupid people r / AI isnt even AI. its… (`ytc_Ugy6esv2f…`)
- That is pure evil. There is a big BUT. YESHUA is coming back. He will never leav… (`ytc_UgxES9GGO…`)
## Comment

> Today's AI is what used to be called expert systems, but on a grand scale. There are no intelligence in them, instead they map patters. Patterns are very useful and when you control a lot of them it can look like something smart is going on, but no.
> For AGI we have the problem that we really do not now how it work in our own brain, and do not now how we could construct something to do the same. Even if we could make a simulation it would be very slow for a long time.

Source: YouTube · *AI Moral Status* · 2025-10-30T20:3… · ♥ 3
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
{"id":"ytc_Ugxi_WQDxjBUM3DxMXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyubHlk3SYTc5ECco14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"sadness"},
{"id":"ytc_UgwIP0X6C2Uh3Db8qat4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXGNnOa3vEPzKtm814AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"disapproval"},
{"id":"ytc_UgzxM-ZpKZHHmQchi7d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxdZC77W8Sk51DN1hl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz0EPssorPnG-CUiWx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyL_J9maIR2t9Q5PMp4AaABAg","responsibility":"expert","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzxZWeZ9v_2i70bTUh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzP2ObOZA0ZLAXZoTB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
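The lookup-by-ID view described above can be sketched as a small parsing step: the raw model output is a JSON array of per-comment codings, and indexing it by the `id` field gives constant-time lookup. This is a minimal illustration, not the tool's actual implementation; the `index_codings` helper and the two-row excerpt of the batch are assumptions for the example.

```python
import json

# Excerpt of a raw batch response (schema as shown above): a JSON array
# of per-comment codings, each carrying the comment ID it applies to.
raw_response = """
[
 {"id": "ytc_UgzxM-ZpKZHHmQchi7d4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
 {"id": "ytc_UgxdZC77W8Sk51DN1hl4AaABAg", "responsibility": "distributed",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the codings by comment ID."""
    codings = json.loads(raw)
    return {row["id"]: row for row in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_UgzxM-ZpKZHHmQchi7d4AaABAg"]
print(coding["reasoning"], coding["emotion"])  # consequentialist resignation
```

The same index supports the "Coding Result" table: each dimension column is just one key of the looked-up row.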