Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- ytr_UgzM7XPF5… — "Haha, I can see why you'd think that! Sophia definitely has some traits that mak…"
- ytc_Ugw3gLPCs… — "It's ridiculous how much people make anything ai out to be evil. It's just a too…"
- ytr_UgwWyzwEN… — "Why do you think people are blind? Although...you know, Instagram is already ful…"
- ytc_UgwdJQr29… — "They need to stop while they can we dont need human looking robots bro. Nothing …"
- ytc_UgxGdC3Jn… — "They’re acting like the only reason people hate ai is because it saves too much …"
- rdc_nhzcsnh — "Sort of. If I hire an artist to draw me a mickey mouse for my personal use, the…"
- ytc_Ugy7Dj0Pz… — "If the people teaching AI doesn’t know how to teach morals , where will AI learn…"
- rdc_lp6uihp — "Too bad they can’t build something that runs on Cheetos and Pepsi like the human…"
Comment
> @MasonRamon-n6c My point was AI kills humans because "our existence is bad" but AI never seems to question its own purpose or existence. Why would AI take over the world anyway? What's it's reason?

Source: youtube · Topic: AI Governance · Posted: 2023-08-06T11:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxHWrUL9lol7MUX0TR4AaABAg.9sYlZkbt9859t3d5hqJv7P","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwLUAk7yEtlZ05tSMt4AaABAg.9sVHeH1_W0E9sWREmmMZit","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxC4aJAQTgPcCZ9nm54AaABAg.9sUXpw_ltfo9sU_3mtSHro","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzA9jS19YrqkIKh0zR4AaABAg.9sUV95gIv_C9tr9VRSa_H4","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgzA9jS19YrqkIKh0zR4AaABAg.9sUV95gIv_C9uqrXcZHNGK","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgymFehxVrhhcR22pQV4AaABAg.9sGt8JnmNIG9t3cQ5JkGJv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgymFehxVrhhcR22pQV4AaABAg.9sGt8JnmNIG9t3yfha1ONm","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgyJISfd-Pk4g4Rlggh4AaABAg.9sEZul8t6pT9sGBanIQ5th","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzLNtXBLg8PvxVLOp14AaABAg.9sETTTHnDEm9sGCG9B4SRm","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugz-sFbHGzKZNyF6Og14AaABAg.9s97r8rTeLw9srZ64xSOfN","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
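A raw response like the one above is a JSON array in which each element carries the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion) plus the comment ID. As a minimal sketch of how such a response could be parsed, validated, and indexed for the ID lookup described at the top of this page — the variable names and the example IDs here are illustrative, not part of the actual tool:

```python
import json

# Illustrative raw LLM coding response (shortened, with hypothetical IDs).
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
"""

# The four coding dimensions from the result table, plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

codings = json.loads(raw_response)

# Validate that every coded item carries exactly the expected fields.
for item in codings:
    missing = EXPECTED_KEYS - item.keys()
    if missing:
        raise ValueError(f"item {item.get('id', '?')} is missing {missing}")

# Index by comment ID so a single coding can be looked up directly,
# mirroring the "look up by comment ID" feature of this viewer.
codings_by_id = {item["id"]: item for item in codings}

print(codings_by_id["ytr_example1"]["emotion"])  # fear
```

A dict keyed by ID gives constant-time lookup; for larger batches the same structure could be loaded into a dataframe instead, but for per-comment inspection a plain dict is sufficient.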