Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I dont see a problem by AI taking over jobs. More freetime for us. If you define…" (ytc_UgxkKXhXc…)
- "Well. Apparently we going to end up with over 8 billions human beyond poorness. …" (ytc_UgxVjTrkI…)
- "Sure 🙄 media keeps hyping up this fraudulent narrative that AI is causing all of…" (ytc_UgwlPLhEG…)
- "plot twist: I’m autistic so even without ai, ai detectors flag 90% of my writing…" (ytc_Ugw3OG5C4…)
- "Robots will become so advanced that they will be able to replace nearly all work…" (ytc_UgxvlZ7T5…)
- "I'm just glad they don't have bodies yet. Maybe we should consider just... bombi…" (ytc_UgxoXXQud…)
- "Here's what I find scary: Imagine that you're a company that sells 'heavy duty b…" (ytc_UgyaWU9xZ…)
- "AI is like handing a machine gun to a chimp. Best take cover. Or better yet don'…" (ytc_UgzJjxpAc…)
Comment
It seems to boil down to ai + attendant hardware will be able to out-perform human beings in ways that if we are not careful will result in a world that becomes inimical to actual human thriving. The rest is moot. Our urgent task at this point is to appreciate this and engineer in guard-rails that will prevent this happening. We are not doing this very well. Unfortunately the proximate incentives are all the usual 'unhelpfuls' - wealth, status, power etc.
youtube · AI Governance · 2024-11-12T01:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxwDnlEHA7QFwMzrZB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwGPNiP4G115HlCMmB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxgn2QDG4u3GwUCBPh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz431MRgmzceabjLdd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzcbFmhgeHbLrPqRyN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx-xpntgp4QxxIED5d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwePVVbMUGmOuwAgch4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyNv7S5t7BOv9eoxYZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwnNR89T2lV3e0tf7Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwIZrGwu4CUO899WoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
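Before loading a raw response like the one above into the coding results, it is worth validating each record against the codebook. The sketch below is a minimal example of that step; the set of allowed codes per dimension is inferred from the values visible in this sample output, not from a published schema, so treat `ALLOWED` as an assumption to be replaced with the real codebook.

```python
import json

# Allowed codes per dimension — inferred from the sample response above,
# NOT an authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and return only well-formed records.

    A record is kept if its id carries the expected "ytc_" prefix and
    every dimension holds a value from the (assumed) codebook.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue  # drop records whose id the model mangled
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering rather than raising keeps one bad record from discarding a whole batch; the dropped ids can then be re-queued for a second coding pass.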