Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Id be shocked if it doesn't happen by next year is a threat to the grok employee…
ytc_UgwnCGn9K…
@FloridaTrawler dude, they are egocentric people who just want the AI to tell th…
ytr_UgzKlu6fH…
No thanks. Don’t believe it will be used for good, only evil. You can make it fu…
ytc_UgwCyXFWQ…
Sad thing is, we dont even need the uses of generative ai 🤷♂️🤷♂️ its just for …
ytc_Ugx3jclus…
@BrownBrown270 Sam Altman CEO Project Stargate and not too hard to see where/wh…
ytr_UgwFS1rQP…
you can have a temporary career as a plumber whilst robotics catches up with AI…
ytc_Ugyc-Lc6z…
Yes. Some of those does not sound implausible. To be exact, AI going beyond huma…
ytc_UgzWFHzHG…
The emergence of the various forms of intelligence underscores the fact that men…
ytc_UgxRlzu5y…
Comment
The fact that this came out the same day as Eddy Burback's video about making good on ChatGPT's encouragement of delusions is frankly serendipitous
youtube · AI Moral Status · 2025-10-31T18:4… · ♥ 199
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
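The coded dimensions above can be sanity-checked programmatically. The sketch below is a minimal validator, assuming the value vocabularies seen on this page (the sets in `ALLOWED` are inferred from the samples shown here, not an exhaustive codebook):

```python
# Dimension vocabularies observed on this page; illustrative only,
# the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "mixed", "resignation",
                "outrage", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The coding shown in the table above.
coding = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "approval"}
print(validate(coding))  # -> []
```

A non-empty return value flags which dimension carries an unexpected value, which is useful when the LLM occasionally emits a label outside the schema.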
Raw LLM Response
```json
[{"id":"ytc_UgyWBa3ZHDwbz_TRHOR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyNbgCru6frOF9-ROh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyO4sXzepX4g416NsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybbdJWEPoe2zim-2N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxW3DXpKt6efbyIVYB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrRoqpV7tc0mZ3gYl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyQYarHiV2n7AAXJIV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwwdbh1v_HwUmapK114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLBpOhTu1anLR5e0h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxaXODyoIy1XtKSU-R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```