Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "LLM AI is so bad that I have to question even the application of a simple gramma…" (`ytr_UgzV-bdGm…`)
- "I’m sorry but the last robot has human thoughts he was questioning everything po…" (`ytc_UgxzfMwTN…`)
- "I agree with everything you said but a major issue rising about how to “spot an …" (`ytc_UgzjSUIXd…`)
- "This AI thing has scared me to death tbh, it is already better than me in most c…" (`ytc_UgxLTgHmx…`)
- "I'm confused, for 50 years he worked on a field that was meant to replace humans…" (`ytc_UgwJLcKir…`)
- "AI will never be better ngl, it lacks soul and personality like all the people d…" (`ytc_Ugwag4SyH…`)
- "AI will have it's own limitations. If it can't self-replicate and generate mobil…" (`ytc_UgzR1qvEQ…`)
- "Yep consciousness is a gradient that we're all familiar with, it starts at zero …" (`ytr_UgzOVidFF…`)
Comment
Ive been a software engineer since 1997. My occupation is cooked. 100% - we will still be engineers, but we will be moved to AI (already happening) and eventually to project level (soon), but then we will be completely out of all work in our field. And at each step we will shed 50% of the job market in "IT". And I aint talking networking. I mean you use a computer to do your job.
youtube · AI Governance · 2025-09-06T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDPkf7k_6xuZnorcJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy75YnCqWzHt2C55dp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwcYHZPGVBURzyIwWZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxctfGCdqPrA1tKQAZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy1I9boVGc9BJmacJx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKTWA80zPWmqaakl14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweZE48SsGTtZy14HN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzp5gSaVYCKXnVR1OZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzZPnEXYqe1v_Wy7d14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPNvZRgAtvzxhY5C14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
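A raw response like the one above can be turned into per-comment codes with a small parser. The sketch below is a hypothetical helper, not part of the pipeline: it assumes the model always returns a JSON array of objects, each carrying an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`), and it skips malformed entries rather than failing the whole batch.

```python
import json

# The four coding dimensions seen in each record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Assumes the model emits a JSON array of objects; entries that are not
    objects, lack an "id", or are missing any dimension are silently skipped.
    """
    coded: dict[str, dict[str, str]] = {}
    for rec in json.loads(raw):
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed entry: drop it, keep the rest of the batch
        if any(dim not in rec for dim in DIMENSIONS):
            continue  # incomplete coding: drop rather than guess
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"mixed"}]')
print(parse_coding_response(raw))
```

Skipping bad records (instead of raising) matches how a batch coder typically degrades: one garbled line from the model should not discard the other nine codes in the response.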