Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I been learning about AI in many many years ..and AI problems aren't solved by s…" (ytc_UgwQvAJo4…)
- "Microsoft Windows screwed up big time when they layoff human developers and repl…" (ytc_UgzJKO81j…)
- "I think every human on Earth agrees that we want humans to do every job, not AI,…" (ytc_UgzUF6uK_…)
- "Too late Bernard. Not just the working class, AI is going to wipe out Humanity.…" (ytc_UgyoOM7NK…)
- "They didn't even discuss fully autonomous military drones, and how these would c…" (ytr_Ugz-I_5z2…)
- "ChatGPT is meant to be a yes man at you. A therapist is meant to do nearly the d…" (ytc_UgzY-outV…)
- "at this point id rather someone trace art than make AI art. (still dont do eit…" (ytc_UgxJCZGqo…)
- "It's also not a good feeling, to stand at a convention and having an odd feeling…" (ytc_UgwzkcOKs…)
Comment
"It's not just the AI as the people using it are the ones that develop it"
youtube · AI Responsibility · 2025-11-01T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxgfOKS46CmPI5z06J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy89sKCfpInLZuDaNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxkerL5K5ZX8mc2Y0d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyOH9gjxqB8N1GFXVB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw7OVTwmYMXmwEFqD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgytMnvOlcNVPtXFzvR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyW2suBp23D5QhrW694AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyI9rE9nDSGuyGY7fF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxilt6Ioq70I_ByjfF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCmtagD-QyNv7byb14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
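The lookup-by-comment-ID feature described above can be sketched in a few lines: the raw model output is a JSON array of coded records, so finding a comment's coding is a parse followed by a scan on the `id` field. This is a minimal illustration, not the tool's actual implementation; the `lookup_by_id` function name is hypothetical, and the two records are copied from the sample response above.

```python
import json

# A fragment of a raw LLM response: a JSON array of coded records,
# one object per comment (records taken from the sample above).
raw_response = """[
  {"id": "ytc_Ugxilt6Ioq70I_ByjfF4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyCmtagD-QyNv7byb14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

def lookup_by_id(raw: str, comment_id: str):
    """Parse the raw model output and return the record for comment_id,
    or None if the model did not code that comment."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_by_id(raw_response, "ytc_Ugxilt6Ioq70I_ByjfF4AaABAg")
print(record["emotion"])  # -> indifference
```

Scanning with `next(...)` returns the first match and defaults to `None`, which also surfaces comments the model silently dropped from its output, a common failure mode when batching comments into one prompt.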