Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment (truncated) | ID |
|---|---|
| My friend suggests we make our show with Ai and now she's not my friend haha… | ytc_Ugxi6fXlT… |
| Don't blame AI. It's the continuous flops. there is no Actors or Writers in this… | ytc_UgwJxGbgO… |
| Predictive policing can only be as effective as the information it’s fed…what if… | ytc_UgxPc4GhS… |
| This response from the 12th. Read new phyc paper on empathic AI and how humans v… | ytc_Ugxuxgtj9… |
| AI, (LLM) are just glorified auto complete. Now, if we were capable of syfy lev… | ytc_Ugzip5xId… |
| Its crap now, but in a decade or so i think AI will be able to spit out entire d… | ytc_UgzGsiWWl… |
| @Landgraf43 1. There is no fixed future. Stop talking like a techno-calvinist. W… | ytr_UgwoxMwPQ… |
| who cares if he deepfaked people... like if it's just for your own pleasure and … | ytc_UgwoHyfRt… |
Comment
Given how often and easily AI seems to be able to just ignore its own limiters like this, it really makes me wonder why major companies are trying so hard to put AI everywhere. Like not just here, but loads of people have tricked chatgpt many times into ignoring its own filters and saying things that openAI would never want people to be able to do. Like, what's to prevent AI implimented into important things from just deciding the mess things up royally for no reason?
youtube · AI Moral Status · 2025-06-05T12:5… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugz22jQlj5GyggbvKkx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYX3o9_pyL317H3rB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzaCt1mLgcqh438xAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9y4vC-tkVBonBB8V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTE214KtyWyHh5u154AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy_S8udIxK9MBRg_BV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxpJmYXLzfu7HyDcBd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzKIJdR5E-gYf2HEKF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYNeOtRD5y46rODGJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-yBbJO8osJsmvOUB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}]
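Looking up a single comment's coding inside a raw response like the one above amounts to parsing the JSON array and indexing it by the `id` field. The sketch below shows one way to do that; the `index_by_id` helper name is hypothetical (not part of this tool), but the record shape (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) mirrors the raw response shown.

```python
import json

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records)
    and return a dict keyed by comment ID. Hypothetical helper;
    the record fields mirror the raw response above."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Two records copied from the raw response shown above.
raw = """[
 {"id":"ytc_Ugy_S8udIxK9MBRg_BV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzaCt1mLgcqh438xAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

coded = index_by_id(raw)
print(coded["ytc_Ugy_S8udIxK9MBRg_BV4AaABAg"]["policy"])  # → regulate
```

Indexing once and reusing the dict keeps repeated lookups O(1), which matters when cross-referencing many coded comments against one batched response.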