Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- rdc_o8qlt9z: Yeah bc the library of Alexandria wasn’t established by a pharoah during a time …
- ytr_UgylmH1__…: AI isn't biased. It conforms to objective reality. An objective reality that mos…
- ytc_UgxbdPwRP…: AI can’t become conscious because it just regurgitates whatever humans put into …
- ytc_UgxYLRN90…: "it's able to create a perfect replica," no, in general that's incorrect and imp…
- ytc_Ugw6GUqye…: Sorry if you cant drive without ai then you shouldnt be allowed to have a licens…
- ytc_UgwMo_h6_…: Honestly, this inspired me to get back into writing/drawing again. Lately, I'd h…
- ytc_Ugx251Qcx…: It’s wild, in college I was accused in front of my classmates for cheating on a …
- ytc_UgwNE92zu…: I would want my female companion to look like Mary Lou Retton or Marianne from G…
Comment
This is very concerning. Altman, Musk and Gates have been warning us that this could happen. Congress has procrastinated about this long enough. Although I use AI myself, I can’t help but think of Hal in 2001 Space Odyssey. I asked my AI if they could take over world and it replied, “that would be very difficult as we are run by humans. However, should regulations be slack and if bad players are at play, this would elevate AI’s ability to do this.”
youtube
AI Responsibility
2025-05-10T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzt9rZzslpSPyrfO3t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyyGavY_IxRYioP3z94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwWKGG6siD2OOu2oeF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxE1MXjmBOrzIyqnLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwuPU8SxE03k_Kvzrh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyR7-UBUQyGNsfSQ_p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzF1cxyZsJQpZd2dG94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBompxYCEkVRCJE7N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzufZhVkwUfrIOHHW94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugztl6ciHFC3RkU4p1F4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
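The raw response is a JSON array of per-comment codings that has to be parsed and indexed by comment ID before a row like the Coding Result table above can be rendered. A minimal sketch of that step, assuming the four dimensions shown here and label vocabularies inferred only from the values visible in this sample (the real codebook may define more; `index_codings` and `ALLOWED` are hypothetical names, not part of the pipeline shown):

```python
import json

# Raw batch response as returned by the model (truncated to two rows here).
raw = '''[
  {"id": "ytc_UgzufZhVkwUfrIOHHW94AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxBompxYCEkVRCJE7N4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

# Allowed labels per dimension, inferred from this sample (assumption:
# the actual codebook may contain additional labels).
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def index_codings(raw_json: str) -> dict:
    """Parse the model's JSON array and index rows by comment ID,
    rejecting any row with an out-of-vocabulary label."""
    by_id = {}
    for row in json.loads(raw_json):
        cid = row["id"]
        bad = [d for d in ALLOWED if row.get(d) not in ALLOWED[d]]
        if bad:
            raise ValueError(f"{cid}: invalid label(s) for {bad}")
        by_id[cid] = {d: row[d] for d in ALLOWED}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgzufZhVkwUfrIOHHW94AaABAg"]["responsibility"])  # government
```

Validating labels up front matters because an LLM can occasionally emit a value outside the codebook; failing loudly at parse time keeps such rows out of the downstream tallies.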