Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Automation isn't new, it's just jumped from rail to road, autonomous trains have been a thing since the 80s. The only real solution is just to not allow companies to replace paid positions. Automate them if they believe it will improve safety or productivity but still require someone, an ACTUAL PERSON, get paid for it. Either pay into universal income pool or something of that nature, everyone will still get paid for "their work" but humanity will be able to defer most of the actual work to machines, they aren't alive, they don't have feelings and therefore can be exploited in this way without moral issues, at least for now, AI development is somewhat concerning.
youtube · AI Jobs · 2025-10-31T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw3qOemnnZHT7t_KrB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzOOeia2leIjlFv-Dp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzEoVkMc2AJyaStP9J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwI-3CQH_2T3KuFNiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-qoiAOcqBwDG5yR14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzwQ83aaR7BE7ckpup4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKLPC-b2LEjJDDAR14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0NGJ0qPvcvtJcBcJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPruSyFRNtxAFW74d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkpGEXZecply7mjPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
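A response like the one above can be parsed and checked before its codes are stored. The sketch below is a minimal example, assuming the allowed values per dimension are the ones visible in the responses here; the actual codebook may include other categories, and the helper name `parse_coding_response` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the codes visible in the
# raw responses above; the real codebook (an assumption) may differ.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference",
                "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the assumed codebook.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Example lookup using the first record from the response above.
raw = ('[{"id":"ytc_Ugw3qOemnnZHT7t_KrB4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugw3qOemnnZHT7t_KrB4AaABAg"]["policy"])  # -> ban
```

Validating against a fixed value set catches the common failure mode where the model invents an off-codebook label, so bad codes fail loudly at parse time rather than silently entering the dataset.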