Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Why not make a AI C-Suite? How much is wasted on human carbon based unit execs?…
ytc_UgwFnz4OS…
Experts agree jobs will change, but they don’t expect a complete disappearance o…
ytc_UgzEWxanE…
This is just the. Nick Bostrom book, SuperIntelligence, in the form of a news re…
ytc_Ugz_AJnCs…
There actually is something additional on this. Another part of the AI is traine…
ytc_Ugzxe2Ymq…
175'000 dollars
It is magnificent and beautiful, the Swedish style
Everything that I lo…
ytr_UgxNh1k5-…
Based on that comment, if I were a writer, I would absolutely quit and go find a…
ytc_UgymNlvR9…
Yall be saying all this but a robot was first place in a marathon with both huma…
ytc_UgxexAEXv…
I think AI art is fine for some things. Specifically, personal use. I’m in a dnd…
ytc_Ugw7aKoXT…
Comment
I think all these cases that apply responsibility for others for this sort of thing, even beyond chatgpt, are nonsense USA litigation and removal of personal responsibility. If you're delusional and take action based on your delusions then that's your fault and the recent spring of cases shirking this responsibility onto parties that supported the delusions disregards the individual's responsibility and power in listening to them.
I think this is absolute bullshit and should not reduce technology for everyone with guardrails because others are too fragile to deal with knives.
youtube
AI Harm Incident
2025-11-07T20:1…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy6b8F1FI5S63ISTs54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwgZmXTdMVNZDh95t14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxdLqRlHefU4gsppA54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxXEgM6_IqnVsemvjF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzDUavGL__FdNMQd0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwIkPeK6bOxsGQdw8J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwMzuDyPuG3Q6__ai14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxp77jKXIBFNu2rmhB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwr--PlSRQfR6EOUr94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwqcrRjlNSTRTQ5YVV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
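The "look up by comment ID" view above can be sketched with a few lines of Python: parse the model's raw JSON array and index each coding record by its comment ID. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the dump above; the function name and the two-record sample payload are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Illustrative two-record payload in the same shape as the raw response above.
raw_response = """
[
  {"id": "ytc_Ugy6b8F1FI5S63ISTs54AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxXEgM6_IqnVsemvjF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Map comment ID -> coded dimensions, dropping the redundant id field."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

by_id = index_codings(raw_response)
print(by_id["ytc_UgxXEgM6_IqnVsemvjF4AaABAg"]["emotion"])  # → outrage
```

Keying by `id` makes the per-comment "Coding Result" table a single dictionary lookup rather than a scan of the raw response.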