Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific one by its comment ID.
Random samples — click to inspect
Perfect, other nations stepping up when the US pulls out, now how do we get the …
rdc_dcx4460
It's sad that I listened to so much of this before he said "I can make as much g…
ytc_UgzIAnZbh…
Your not working and getting paid to so.
ytc_UgwDOGlBF…
There's a lot of blue collar jobs that AI will never be able to do. Plumbing ele…
ytc_UgxtZj0fh…
ai cannot be emotional, ai has no emotion. ai music has no soul, has no deep set…
ytr_UgxZ8OHlp…
@D7STlNY oh i didnt realise it was a literal ai brain. I best hop on board the …
ytr_UgxY_rKwl…
If you work in a job that requires efficiency and repetition that follows a logi…
ytc_UgyRZlecs…
Sorry but I would trust a human truck driver over an AI driver any day of the we…
ytc_UgzAjiq61…
Comment
Technology wise we are at level 10 while socially wise we are at level 3. Globally acting as one is really necessary when experimenting with AI. We need world peace and we need to have the same idea, which is collectively using AI to the advantage of the whole planet, not using it to wage war against each other because it will destroy us.
youtube
AI Governance
2025-09-16T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx4C1CJcIAie6JyfkF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzxFPkTNWxN2sni2xt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxVqNUPTvA-_RcUnGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzpUmXLMT80PcMpyqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxf04wjZ81zKubyhOB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyKZNOM_Ew8juTGZIZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx4LYpJnAzH_W9MbV14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxjqT9iHCEy_WHGXP14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxNnBlggs2UvjcqjB54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2LD3eZO4mS1P1hLh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
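The raw LLM response is a JSON array with one coding record per comment, keyed by comment ID and carrying the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and indexed for lookup by ID — the helper name `index_codes` is hypothetical, not part of the tool:

```python
import json

# A fragment of a raw model response, in the same shape as above.
raw = """[
  {"id": "ytc_UgzxFPkTNWxN2sni2xt4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyKZNOM_Ew8juTGZIZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def index_codes(raw_json: str) -> dict:
    """Parse the model's JSON array and index each coding record by comment ID."""
    return {record["id"]: record for record in json.loads(raw_json)}

codes = index_codes(raw)
print(codes["ytc_UgzxFPkTNWxN2sni2xt4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each inspection is a single dictionary access rather than a scan of the whole response.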