Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
Most Androids or Humanoid AIs nowadays just looks like remote-controlled ones, t…
ytc_UgxJA8Y71…
scenario testing is basic for any userfacing system, crazy how many ai still ign…
ytc_UgxKWk22_…
AI its good to do what its told to do..... and nothing else......the matrix its …
ytc_Ugwp2Wb03…
Modern Ai has its use in this world, but in its current state, especially Image …
ytc_UgzKD8PyU…
AI will not destroy us, we will destroy ourselves. This country has been turned …
ytc_UgwFWp6wl…
I have thought for some time now that AI's are writing the Si Fi movies. It's th…
ytc_UgyFLyXbE…
When people are asking me what im talking about when im saying that ai art is so…
ytc_Ugw24h1mA…
You guys should try Clever AI Humanizer! It’s 100% free and actually works reall…
ytc_Ugxej7PI6…
Comment
For me, right now is a matter of power, as Elon musk said, if he doesn't do it someone else will, this stupid excuse is sadly the same everyone else have. An with thechnology is nothing new, competition will drive its development, not its regulation until some tragedy, but the consequences of AI, I think are much more subtle right now, so regulation is not really incentivized
youtube
AI Harm Incident
2025-09-11T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugyssvt_SsowG0Jyup94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3HOkZ5ptP_KNsnFV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy7kIenXrL9WgdFsEt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwVeECo2XCjgv6Y-fZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9dYXjtUhOT5sfDfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzOETw8f9WPniYu3FV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwkxxcAkmc_LuO3Pnd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZHF9kT0cn6pdtbZR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw_OABAnkYuo_S-Cgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzDFiKQ_2-OI4z3SGJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
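The raw response above is a single JSON array with one object per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" view — the function name `index_codes` and the truncated two-row sample are illustrative, not the tool's actual implementation:

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codes
# (two rows taken from the full response shown above).
raw_response = """[
{"id":"ytc_Ugy9dYXjtUhOT5sfDfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzDFiKQ_2-OI4z3SGJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]"""

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the model output and index the codes by comment ID."""
    codes = {}
    for row in json.loads(raw):
        # Keep only the expected dimensions; a missing key becomes None
        # rather than raising, so malformed rows are still inspectable.
        codes[row["id"]] = {dim: row.get(dim) for dim in DIMENSIONS}
    return codes

by_id = index_codes(raw_response)
print(by_id["ytc_Ugy9dYXjtUhOT5sfDfx4AaABAg"]["emotion"])  # resignation
```

Indexing by ID in one pass makes each lookup constant-time, which matters when the same response is inspected for many different comments.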