# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples (click to inspect):

- "you're preaching to the wrong crowd. these are viewers that don't know anything …" (ytr_UgxRiA61C…)
- "Its not. Its even counterproductive. People are tired, care less about what they…" (rdc_dv05fym)
- "My take on it all is that humans were never unified in such a way that an AI th…" (ytc_UgxFIXmH_…)
- "I think this is how AI should be used. To create a reference or give you an idea…" (ytc_UgzjMJXLU…)
- "idk about ironic...software engineering in general has always been about doing a…" (ytr_Ugz67iDYj…)
- "Yes, Claude in particular seems invested in the concept of its own qualia. It's…" (ytr_Ugzonnlir…)
- "Thanks for taking the time to break this all down where anyone can understand ho…" (ytc_UgzMlWuvV…)
- "@HuhAundre yeah it’s scary that one day AI can literally copy every single thing…" (ytr_UgyThZ0r0…)
## Comment

> If AI is able to increase production to a point where human labor is not needed what is the point of continuing the current system? We should just create a new political and economic system. I think the society will become more and more socialist. People would not be able to get work and resources will be in such large amounts that scarcity will be gone. It would make sense to have a minimum work time and allow people to take what they need. Is it ethical to prevent people from profiting from AI

Source: youtube
Posted: 2013-06-21T03:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugx1owY8KMQQpaUVoL94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxCHEe9gdHj20ZduC54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKrONnTALkMSiXPy94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyndkOOK5UcwRDL7P14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugxf1ljBqJhDZTII-rN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxVWbMuIk6D-APhN2t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwIMriQa68OOqDcqr14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzbBcYCXu1UFiCis_x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzZyZyocjtkajZUTrt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyyGJ-lF5FZQ7m-da54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
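A raw batch response like the one above can be parsed and sanity-checked with a short script before the labels are stored. This is a minimal sketch, not the pipeline's actual validation code: the allowed vocabularies below are inferred solely from the values visible in this dump, and the real codebook may define additional categories.

```python
import json

# Two entries copied from the raw batch response above, for illustration.
raw = """
[
  {"id":"ytc_UgxVWbMuIk6D-APhN2t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx1owY8KMQQpaUVoL94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
"""

# Label vocabularies inferred from the values seen in this dump (assumption:
# the actual codebook may allow more categories per dimension).
VOCAB = {
    "responsibility": {"developer", "company", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"fear", "approval", "mixed", "outrage", "resignation", "indifference"},
}

def validate(entries):
    """Keep only entries whose labels all fall inside the expected vocabularies."""
    return [
        e for e in entries
        if all(e.get(dim) in allowed for dim, allowed in VOCAB.items())
    ]

entries = json.loads(raw)
coded = {e["id"]: e for e in validate(entries)}

# Look up the coding for the comment displayed above by its ID.
print(coded["ytc_UgxVWbMuIk6D-APhN2t4AaABAg"]["policy"])  # prints "regulate"
```

Keying the validated entries by comment ID is what makes the "look up by comment ID" view above cheap: each coded comment resolves to its dimension labels in a single dictionary access.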