Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "These self-driving cars should never be allowed on the roads. A lot of drivers s…" (ytc_Ugx827I7q…)
- "Sounds a lot like the Montessori middle school I went to. We had a teacher but s…" (ytc_UgytV6wIL…)
- "If AI is about "democratizing art", how come nearly all AI posts I see have all …" (ytc_UgyEshRDz…)
- "If future of jobs is us humans not actually doing valuable work anymore but rath…" (ytc_Ugw4HizLm…)
- "When you said that AI wrote that story the hair on the back of my neck stood up.…" (ytc_UgxhJ4m5z…)
- "@laurentiuvladutmanea i'd like to point out the biased language you used. you s…" (ytr_Ugw51FNt-…)
- "The irony is that all these companies replacing humans with AI don't understand …" (ytc_UgzQkhKRO…)
- "Dang I am really disappointed on how people are using ai to create a ghibli styl…" (ytc_UgxmWeOU2…)
Comment
I know virtually nothing about AI but I’ve never been concerned about it in and of itself and its application, it’s the human side of it that I think will always make it flawed. AI can only learn from what it researches and interprets from material created and presented by actual people, so even if it learns from another amalgamation of information it still has to be engendered from a human source. The other thing is AI cannot do anything or act in any manner from this information without human intervention, so it’s only as dangerous as the person who consumed and consequently takes further action from that overview.
Source: youtube · AI Harm Incident · 2026-04-07T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw94st37z9u5eKoSGd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyPf8Y29nf3fcgFDrl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGTj0TuvB0PEI3pGB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzgjyg4b5klMLmPv194AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz4HsBjgaRcV_MH7tZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzekvc7UN9OPZcNA6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyJFcYQHpdYH20IrDF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJ1WcQ8YQdi-hkgZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwUXTu8j6i2NshJli14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzGWlhS1HipDO4DsgF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
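A response of this shape can be parsed and indexed by comment ID with a short helper, which is how the per-comment coding tables above can be looked up. This is a minimal sketch, assuming the model reliably returns a JSON array of objects with an `id` field plus the four coded dimensions; the `index_codings` helper and the two sample entries (copied from the response above) are illustrative, not part of the actual pipeline.

```python
import json

# Two entries copied from the raw LLM response shown above, for illustration.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugzgjyg4b5klMLmPv194AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz4HsBjgaRcV_MH7tZ4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# The four coded dimensions every entry is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(raw: str) -> dict:
    """Parse a raw model response and index each coding by comment ID.

    Raises ValueError if the output is not a JSON array, or if an
    entry is missing its ID or any expected dimension.
    """
    entries = json.loads(raw)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of codings")
    index = {}
    for entry in entries:
        missing = [d for d in DIMENSIONS if d not in entry]
        if "id" not in entry or missing:
            raise ValueError(f"malformed entry: {entry!r}")
        index[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return index


codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugzgjyg4b5klMLmPv194AaABAg"]["emotion"])  # resignation
```

Validating every entry up front (rather than on lookup) surfaces malformed model output immediately, before any coding is written to the results table.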