Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "As someone who writes software I feel the same about the constant "suggestions" …" (`rdc_nm1j16e`)
- "People that think this thing is alive make me laugh. Ai models are stochastic pa…" (`ytc_UgwZ-LP7s…`)
- "DANGER OF AI TO MAKE THEM LIE AS THEIR PILE TO LENGTHEN IT LIKE A NILE FROM A MI…" (`ytr_UgxPw_P4j…`)
- "Everything these people are whining about is the same crap people whine about ev…" (`ytc_UgyqBKMh4…`)
- "Golden Garuda I think it’s one of those things were we make ai so smart that the…" (`ytr_UgwS-unyp…`)
- "I just knew it would be a matter of time. You ARE the responsable driver, theres…" (`ytc_Ugy9sM_k1…`)
- "If scientists can envision Boltzmann brain, then the concept of AI having consci…" (`ytc_Ugyf8V1Bz…`)
- "Short term profit trumps long term risk. OpenAI is no different no matter how th…" (`rdc_l4orss3`)
Comment
> UBI is a pipe dream
> ZEROOOOO governments will give it
> It’s a dream, nobody buying their A.I products and they’ll go bankrupt taking investor money with them to the bottom, no jobs equals no money to buy
> This utopia you speak of doesn’t exist as man always has greed and greed wins …always

Platform: youtube · 2026-01-18T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw2PSXwpnegqz4aEq14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz3zHM5f39r9XkYIil4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwXuKwTAvh19H3CzYh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxL6fzPUS_pnC3w4kV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxtwuxt_pWGsmx853d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzcMZYBAFzFEWd-ebN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxWDT-Fl97W3X4mk254AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxUG7V3cf4MRbvs1u54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy4lueBKOh85d-kUW14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxnycfajtiIIQGgzld4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
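The lookup-by-comment-ID step described at the top can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation: the function name `index_by_comment_id` is hypothetical, and real model output may need validation (or repair) before `json.loads` succeeds.

```python
import json

# Two entries copied from the raw response above, for illustration.
raw_response = """[
 {"id":"ytc_Ugw2PSXwpnegqz4aEq14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugz3zHM5f39r9XkYIil4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects)
    and index each coding by its comment ID."""
    codings = json.loads(response_text)
    return {entry["id"]: entry for entry in codings}

index = index_by_comment_id(raw_response)
coding = index["ytc_Ugz3zHM5f39r9XkYIil4AaABAg"]
print(coding["emotion"])  # outrage
```

Keying the parsed array by `id` makes the "inspect the exact model output for any coded comment" view a constant-time dictionary lookup rather than a scan of every batch response.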