Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI and robots will eliminate many jobs, and the necessary workers will use robot…
ytc_Ugz94j5WO…
Does anyone know when they are going to revert these restrictive changes? My Cha…
rdc_nk2pc1g
Is this an actual company or is it satire? Yes I know I can do my own research b…
ytc_Ugx58p3BW…
him complaining about his "art" being stolen is so ironic and hypocritical cuz p…
ytc_UgzpfIRIW…
Love is a verb. How does AI show love in an action?
Wisdom comes from trauma.
Ho…
ytc_UgwJyGBjq…
the internet is crazy, one moment you see a video of a guy blending a cat then t…
ytc_UgwgkIE6R…
Fun fact; whis short film was partly why nations disallow automated tracking b…
ytc_UgwqlYL31…
That's a thoughtful take, and the video actually addresses this exact tension. Y…
ytr_UgzrYcxxD…
Comment
Peak example for why LLMs have no place in encyclopedic use cases. They're intrinsically prone to amalgamating their training data ("hallucinating"), as their responses are purely based on the probabilistic relatedness of its training texts to the input text and its syntax. They don't think, they don't problem-solve. They just give words that have high probability of following or relating to the sequence of words you input.
Platform: youtube | AI Harm Incident | 2025-12-06T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzoLDifIt3aG_H5fkR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6qzQX67NVnFjaiFV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyQkdgazW2JmfA-pOh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2Op_dlIVfnjjbJt14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz6oDn-c9iudLgk7mp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4yDwbmnCF-rOHUEt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwe2GpEtQphzk5mWqR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxEL8p2VjBFS8Wl3Kx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmeuLX5hchAabtJRF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRhGSC9uJf9Y2W8NV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
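
The raw response is a JSON array with one object per coded comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such output could be parsed and sanity-checked in Python; the `ALLOWED` sets below are inferred only from the sample values shown above, not from any official codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# (Assumption: the real codebook may define additional categories.)
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject out-of-codebook values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with one record from the sample response:
raw = ('[{"id":"ytc_Ugw4yDwbmnCF-rOHUEt4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # → regulate
```

Validating against an explicit value set catches the common failure mode where the model emits a label outside the schema, which would otherwise silently corrupt downstream counts.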