Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugxs-tsOA…` — "Hi Dave. I like your work and your insistence on epistemic hygiene, which is why…"
- `ytc_UgxWelTOB…` — "well wait. if we are made in Gods image, and ai is made in our image, then would…"
- `ytr_UgykDrhFH…` — "I totally understand your concerns. The rapid advancements in AI can feel overwh…"
- `ytc_UgxTvoULI…` — "A great thing about autopilot is that it can do sudden switches into another lan…"
- `ytc_UgwRWVZVA…` — "Non of this is surprising no one, let alone AI researchers, as a metter of fact …"
- `ytr_UghSFLJx8…` — "Ryan Low signals take time to send, recieve, process, and respond to. these cars…"
- `rdc_gd9m13f` — "Dumb humans decide they should have the final say in decision made by super inte…"
- `ytc_UgxkwlJLh…` — "Dark forest hypothesis... If ya know ya know. Maybe Ai even runs the galactic sh…"
Comment (platform: youtube, posted 2026-02-10T23:3…)

> I'm sorry, but does Ai have emotions? Does AI have to feed it's 3 month old baby? Paying AI just sounds like a way of screwing over the working class/poor. Did anybody vote to be replaced by a higher intelligent species? What about the ethics of ramming this down the throats of everybody on this planet as long as we're talking ethics? It's a joyride for you guys, but that ain't going to be the vast majority of this world's experience. The dystopian 15-minute city is sounding pretty good after listening to you guys.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugx0D0BqJIoJtPN-bnR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzc-5WPzZ2MsuWxCwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy66fS1HNyCAt1z7IJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzNDYe2N9T01_JR3nx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxW3CtlfcG03TqcNjl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxPB46nHqsrKhz9Exp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwz7iNk2pAlvbEqOH94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyDvYV5JmdWL8oLdJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwWZqpHvcU6aCVM3zN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyDyfj2iqMlQclHBDd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
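A batched response like the one above can be sanity-checked before its values are written into the coding table. Below is a minimal sketch of such a check; note that the category vocabularies are inferred only from the responses shown on this page and the real codebook may contain additional values, and the `validate_batch` helper and the sample `raw` string are hypothetical illustrations, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the real codebook may define more categories than are seen here.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing IDs
    or out-of-vocabulary dimension values."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Hypothetical example input with a made-up comment ID.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
records = validate_batch(raw)
print(records[0]["policy"])  # -> ban
```

Rejecting a whole batch on the first bad record keeps the coding table consistent; a gentler variant could instead collect failures and re-prompt the model for just those IDs.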