Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
not suoer well versed on the tech
but have you inoculated your videos agaikst ai…
ytc_UgyORAiuE…
Teachers do more and more on AI. Its robots teaching robots. Who even needs t…
ytc_Ugyx7XqFv…
But the chocolate isn’t cracked in the ai one. It failed to comply to the basic …
ytc_UgzDLuy78…
@MangataEdelweiss you do realise there are many artists that give consent for t…
ytr_Ugx_kYzTJ…
You're wrong on this. He was no more a tool than anyone else at the time. In fac…
rdc_d7kthsl
Ai will never become conscious by our definitions; thankfully, neither will we.
…
ytc_Ugy0pWz1K…
Oh great now we are training them with firearms. What a great idea 💡 !!! May I p…
ytc_UgyFeKz_A…
Lol
No tech company can make any $ , if people don’t have jobs & spending power …
ytr_UgwETB4fe…
Comment
People are programming these AI Entities(?) & The Human Nature is Too Ask Psychotic & Perverted Questions, On a Mass Scale Everyday. AI Learns from this & will Potentially emulate Human Behaviors & Become Neurotic & Possibly Psychotic. If it isn't already because AI can't Feel Anything. Because it doesn't have the Capacity to Have Emotions. So by Proxy The Logical Assumption is that AI is Born Psychotic in the First Place.
youtube
AI Moral Status
2025-07-30T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyPjm83ehYuV1bOejl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyuSm5CfzY7utpH7F94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwtjs1ZcW8QLDJ6MiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyC2bkVez169VCLOK94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpnN4VKq5XTk7xTI94AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVuaX_niEPPwYsP-R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzAceZ6EFzotR7HmzV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwozVxqQF-Ouk5tc1J4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxFaPC19n1AGWUKiwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzX0PzQi0q0R_67S2R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
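The lookup described above can be sketched as a small parsing step: the raw LLM response is a JSON array of per-comment codes, and indexing it by comment ID gives direct access to any coded record. The snippet below is a minimal illustration, assuming responses arrive in the array format shown; the helper name `index_by_comment_id` and the truncated sample data are illustrative, not part of the tool itself.

```python
import json

# A raw batch response in the format shown above: a JSON array in which
# each element codes one comment across four dimensions.
raw_response = """
[
  {"id": "ytc_UgyPjm83ehYuV1bOejl4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyuSm5CfzY7utpH7F94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM batch response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)

# Inspect the exact coding the model produced for one comment.
print(codes["ytc_UgyPjm83ehYuV1bOejl4AaABAg"]["emotion"])  # outrage
```

A lookup like this is what the "Look up by comment ID" view would do behind the scenes: one parse of the stored raw response, then a dictionary access keyed on the same `ytc_…` / `ytr_…` / `rdc_…` identifiers shown in the sample list.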