Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing the random samples below.
Random samples:

- "Cant wait for AI to take our jobs and force us to reconfigure society so as to f…" (ytc_Ugzr91Dxt…)
- "Wow, you went 5 minutes in an AI discussion without a Terminator reference. Not …" (ytc_Ugh4mnc8i…)
- "@asuyaTora that's false, AI has existed for at least 10 years as LLMs, but it's just exp…" (translated from French) (ytr_UgywB_8El…)
- "I only need AI to help me cancel my service without talking to an agent…" (ytc_Ugwi5ANVv…)
- "I think that AI made art probably wouldn’t be nearly as much hated if it wasn’t …" (ytr_UgxPd7TK2…)
- "Thank you for posting this! I’m in almost the same position as you, and writing …" (ytc_UgzmFA6fp…)
- "Well. Elon’s response at 9:20. That’s surreal. That’s a slap of reality …" (ytc_UgyxggAQF…)
- "tire strips. deflate tires. and get a Bubba's truck to pull it out of the way. s…" (ytc_UgwT3HBH8…)
Comment
AI researchers have a neoliberal bias: they conceive of intelligence as something individual. But, as innovation theories from the 1990s have already shown, intelligence is a collective phenomenon. What AI lacks to become AGI is the ability to understand when it should ask for more context (and when it shouldn't), and more importantly, when the missing context doesn't exist and must be defined through negotiation with the user. Missing context here also means undefined scope.
Platform: youtube | Topic: AI Responsibility | Posted: 2025-10-26T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
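Each coded dimension is drawn from a small controlled vocabulary. A minimal validation sketch follows; the `ALLOWED` sets are compiled only from the values visible in this section's batch output (the real codebook may contain more values), and `validate` is an illustrative helper, not part of the actual pipeline:

```python
# Controlled vocabularies inferred from the coded output shown in this section.
# Assumption: the real codebook may include additional values not seen here.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the vocabulary."""
    return [dim for dim, values in ALLOWED.items() if row.get(dim) not in values]

# The coded result from the table above passes cleanly.
row = {"responsibility": "developer", "reasoning": "contractualist",
       "policy": "regulate", "emotion": "mixed"}
print(validate(row))  # []
```

A non-empty return value flags rows where the model drifted outside the codebook and should be re-coded or reviewed.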
Raw LLM Response
```json
[
  {"id":"ytc_UgwGrjEBSDff3mJS1Xp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzt-0Ny5geqbG1VNmN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz0PxcoFMikfAWUJEF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyXWh_3ZqyOZ3OA7jl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyZdHFj3tdT5ZzwWd94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxm-0t2jsES6YOq91h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyxKl4_xMMyy9svw3h4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyhfIkhYlQELOec2914AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyHtMNaLErqJXzNwmt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXyGV9CjQ3MLuorqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
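Looking up a single comment's coding in a raw batch response amounts to parsing the JSON array and indexing it by comment ID. A minimal sketch, assuming the response parses as valid JSON (the variable names are illustrative, and the inline string abbreviates the batch above to two rows):

```python
import json

# Raw batch response as returned by the model, abbreviated to two rows here.
raw_response = """
[
  {"id": "ytc_UgyxKl4_xMMyy9svw3h4AaABAg", "responsibility": "developer",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyZdHFj3tdT5ZzwWd94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID for O(1) lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

row = coded["ytc_UgyxKl4_xMMyy9svw3h4AaABAg"]
print(row["responsibility"], row["policy"])  # developer regulate
```

Building the dict once and looking up by ID is what makes the "look up by comment ID" view above cheap even for large batches.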