Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Neil, given your previous on-the-record skepticism about AI-related dangers, I am very grateful to you for having Geoffrey on as a guest for this discussion. To the whole Star talk team: thank you for what you do!
youtube · AI Moral Status · 2026-02-28T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwhq0kEidwBSXV_gXd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw5USXloCU-0wltaE14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_sMYkpQVpPeroVJt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzbcWSSdnA6p07MZ1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgytO5jNoAUoz6gK5FB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxSJB2aWzlO0lthMn54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfdGAEL0jjL4dFwCp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRLipJIIPbRxTjVLl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgylMwnaAES_OYOYWgR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyjqLUuVFMyRgBIEQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
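The raw response is a JSON array with one coding per comment, keyed by `id`. A minimal sketch of the lookup-by-ID workflow described above: parse such a batch and index it by comment ID. The two entries below are copied from the response shown; the function and variable names are illustrative, not part of the actual tool.

```python
import json

# Abridged batch response (two entries copied from the array above).
raw_response = '''
[
{"id":"ytc_Ugw5USXloCU-0wltaE14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgytO5jNoAUoz6gK5FB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
'''

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment ID (raises KeyError if absent)."""
    return codings[comment_id]

print(lookup("ytc_Ugw5USXloCU-0wltaE14AaABAg")["emotion"])  # approval
```

The same index also makes it easy to cross-check the rendered "Coding Result" table against the exact model output for any comment.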