Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples
- "AI 'Artists' are just stupid kids. They don't understand how hard it is to learn…" (`ytc_UgzHp1FYg…`)
- "There was a primary study done in Detroit in 1980s as automakers wanted to repla…" (`ytc_UgzS35xQy…`)
- "@aferalcat9732 AI generators do not collage, there is no image data stored in a …" (`ytr_UgxBifI8u…`)
- "Good to see more people doing this as well. I have a funny but long story about…" (`ytc_UgwP9C1wt…`)
- "forcibly AI and collect data as much as possible from virtually anyone and simul…" (`ytr_UgyyRmHU9…`)
- "@derfu55l well of course ai doesn't understand it like we do at the moment. I'm …" (`ytr_Ugy1-SiZc…`)
- "Nice tool but change the title of the video... This fixes NONE of the issues we …" (`ytc_Ugzz4uCa9…`)
- "If robots manage to do all our work. Would we still be useful? Will robots still…" (`ytc_UgxJ36o1s…`)
Comment

> You just Quote the Gartner hype cycle and assume this is the same with some anecdotal stories and vague nonsense. That MIT study was very narrow in what it considered a success. It was basically written to buy put options on AI companies a day before the study is released. No substance to this video.

youtube · AI Responsibility · 2025-09-30T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
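A coded record can be sanity-checked against the label sets visible on this page. The sketch below is hypothetical: the allowed values are inferred from the codes shown here, not taken from the tool's actual codebook, and `validate` is an illustrative helper, not part of the tool.

```python
# Hypothetical label sets, inferred only from values visible on this page.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "indifference", "resignation", "fear"},
}

def validate(coded: dict) -> list:
    """Return the dimensions whose value is not an allowed label."""
    return [dim for dim, allowed in ALLOWED.items()
            if coded.get(dim) not in allowed]

# The record from the table above passes the check.
print(validate({"responsibility": "none", "reasoning": "unclear",
                "policy": "none", "emotion": "outrage"}))  # []
```

A non-empty result flags which dimension the model filled with an out-of-codebook value.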
Raw LLM Response
```json
[
  {"id":"ytc_UgxGb_oz0jGAeyBGjG94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxNzFu5Q4M6DfttCiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynEWXhemTaRvIQJYN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzpg6lZD9nugDb3zOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwYHfS1wPeAqyrsySp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx1BJaJiKdne3kqW7R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz0loppRR5yYLvcH0F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyIdh_ixQDBHXgRzG94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxp24aVZQr3RQfNUNF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyH37IeTdQc41dfyux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
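The look-up-by-ID workflow can be sketched as: parse the raw JSON array the model returned and index it by comment ID. A minimal sketch, assuming the array format shown above; names like `codes_by_id` are illustrative, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per
# comment, carrying the four coding dimensions (two rows shown here).
raw_response = '''[
  {"id": "ytc_UgxGb_oz0jGAeyBGjG94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxNzFu5Q4M6DfttCiJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

# Index the parsed rows by comment ID so any coded comment can be
# retrieved directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_UgxGb_oz0jGAeyBGjG94AaABAg"]
print(record["responsibility"], record["emotion"])  # company outrage
```

With the index in place, inspecting "the exact model output for any coded comment" is a single dictionary lookup on its ID.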