Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
- `rdc_dpcmc2j`: "My sister lives in Madagascar right now as a biology researcher and it’s not a p…"
- `ytr_UgztP0Hl6…`: "This guy is afraid that the AI systems will not be Uber liberal... He is afraid …"
- `ytc_UgxBOgAyC…`: "In the end we have to trust humans so we can trust AI... I'm 100% atheist but go…"
- `ytc_UgzvBBlOt…`: "AI art should not exist and please don't compare your beautiful talent to it swe…"
- `ytc_Ugya4Pc7i…`: "There's use for a.i. Not the pathetic ways they try to implement it. However, if…"
- `ytc_UgzbSW5ZJ…`: "for me thats all BS, 1.earth never been our property thinking it is ours its jus…"
- `ytc_UgwmHtNL1…`: "Beware of artificial machine creations, they can bite back when you least expect…"
- `rdc_ckq8yo5`: "Yeah, I wonder who will fund the followup project of policing the illegal loggin…"
Comment
I take for granted that newer iterations of LLMs having guard rails in place for safety reasons, but I still never quite trust them as far as I can throw them without properly cited sources I can read to factually verify, but at that point I'm just as lost as before when checking search engines that don't work anymore and asking internet forum users that either ignore what should be basic questions to answer or who are as clueless as the LLM. Some might say when it comes to safety that 'it's not that deep bro,' but when it comes to inquiring about say, what constitutes adequate ventilation for a soldering iron, 3d printer, laser etcher, or metalwork according to industry standards, I would hope for a clear, real answer from the posts of people and organizations that know. Maybe I'm expecting too much from the internet and should just let go, but there is nothing worse than a question that lacks a correct available answer.
youtube · AI Harm Incident · 2026-02-21T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwpcQCZRzTR_hf7zid4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxU4i7YhI3pjQgjuUR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy7IeUePBR2klJeNil4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx_cXwWEmQ7wGt1fH94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzu3PP8LaAMAF-IVfR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwbcwejp09Od8ubUM54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyrio-vh1RKzzjzEy94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEsjF9nIMKhS3dTjN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugyuvk002-tm4kj3y_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgztKqlE5n2-XWpFZsN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
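The raw response above is a flat JSON array of coding records, one per comment, each with an `id` plus the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed, validated, and indexed for look-up by comment ID follows. The set of allowed values per dimension is inferred from the records above and is an assumption; the actual codebook may define additional values.

```python
import json

# Allowed values per dimension, inferred from the coded records above.
# NOTE: this schema is an assumption, not the project's actual codebook.
SCHEMA = {
    "responsibility": {"none", "distributed", "ai_itself", "user", "developer"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "ban", "liability", "industry_self", "regulate"},
    "emotion": {"indifference", "mixed", "fear", "outrage", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records),
    reject records with out-of-schema values, and index by comment ID."""
    indexed = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

# One record copied from the raw response above.
raw = '''[
  {"id":"ytc_Ugwbcwejp09Od8ubUM54AaABAg","responsibility":"distributed",
   "reasoning":"mixed","policy":"liability","emotion":"fear"}
]'''

codings = index_codings(raw)
print(codings["ytc_Ugwbcwejp09Od8ubUM54AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the "look up by comment ID" view a single dictionary access; the validation step ensures a malformed or hallucinated coding fails loudly rather than silently entering the results table.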