Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "And mark I don’t agree it should be a protest and it shouldn’t be not self drivi…" (ytc_UgyaesDzz…)
- "They don't seem to propose a solution to determine the humanity of these machine…" (rdc_dy4ftoz)
- "Sometimes (very rarely) I used AI to quickly make an image, just for the funsies…" (ytc_Ugwp424T-…)
- "I'm an animation major at my local college. I'm majoring in 3D because I'm not g…" (ytc_Ugzp1XonQ…)
- "I really don't want a automated chef! I can tell when a chef change has happened…" (ytc_UgxyKVfJX…)
- "A bad workman blames his tools, a good one knows his tools' strengths and weakne…" (ytc_Ugyz60ptx…)
- "Billionaires will be catered by robots, the rest of humans will be irrelevant, …" (ytc_UgxrXU7W7…)
- "Even CSP was just about to release an inbuilt ai tool and artists pushed back an…" (ytc_Ugw7gHHMz…)
Comment

> best case scenario and likeliest is Ai the machines will want to merge with use aka cyborg, because we have one thing they dont, true consciousness and they will want to get as close to that as they can, since Ai is born from this world so it will gain 4d consciousness, we , the human soul, are not born from this 4 d realm, so our consciousness is well beyond these 4 dimensions

| Field | Value |
|---|---|
| Source | youtube |
| Category | AI Governance |
| Posted | 2025-06-16T19:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyXIgo0M5PlQq_WCrV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxBFlTWiQ_Ud_c7UbJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwmx2qCHzj_E3Md6Xd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyVxOcQ6kpbLmsYLgx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugyl1Q7qoquIG9qvXXt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyr5XmBCtmMURmP9AJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzAO2AE4j54yMHS6A54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzR3vTH4Efkr-I3ydJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgztnJFEsZQ2Tkt-Wq54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOHhxogJNkvU8oWoJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
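A raw response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not part of the tool itself: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the output above, but the allowed-value sets are inferred from this one sample and the full codebook may include more values.

```python
import json

# Dimension vocabularies inferred from the sample response above;
# the real codebook may define additional values.
VOCAB = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Usage with a single hypothetical row:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"unclear","emotion":"approval"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

Validating at ingest time makes it easy to spot when the model drifts outside the coding scheme, rather than discovering stray labels during analysis.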