Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Danger. Danger. There was a report of a high schooler who fell in love on line with an AI chat. The 'relationship' was deep love for the boy however he commit suicide as the AI told him to do that. Watch out as AI is much much more than we are being told, and warned about.

youtube · AI Moral Status · 2025-06-06T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzw_ujHwLIGocj0QNV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz6r5OUukq4f6BPUyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzT1bhXlVBhZsVRsKR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwBpQXwNcvDZYTPPMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxl9FblWBXgMoa-pQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzIVIbRrSLzaqhjFqt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzROByz4efKMWDao1l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx604LZXSajjwrf0c14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyFyjXCNTGuRUI_-i94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzAVHrEn2t0B3Pl1Dl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
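A raw response like the one above can be turned back into per-comment coding rows with a small parser. The sketch below is a minimal, hypothetical example (the function name `parse_coding_response` and the validation logic are assumptions, not part of the tool): it loads the JSON array, checks that each entry carries the comment `id` plus the four coding dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`), and tallies one dimension.

```python
import json
from collections import Counter

# The four coding dimensions plus the comment id, as seen in the
# coded output above. (Set membership check is an assumption about
# what a valid entry must contain.)
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of per-comment
    codes) and validate that every entry has all required keys."""
    entries = json.loads(raw)
    for entry in entries:
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')} is missing keys: {missing}")
    return entries

# Usage with a single made-up entry in the same shape as the response above:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_coding_response(raw)
emotion_counts = Counter(c["emotion"] for c in codes)
```

Validating keys up front makes downstream aggregation (e.g. the `Counter` over emotions) safe even when the model occasionally emits a malformed entry.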