Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Been saying this from the first talk of all this AI stuff, AI is not going to go…” (ytc_UgxmUFZYw…)
- “The robot isn’t saying anything that we don’t already know. It’s not rocket scie…” (ytc_Ugw5Eb6p7…)
- “I can’t wait to have my robot girlfriend in the passenger seat of my sports car …” (ytc_Ugy8rPjam…)
- “Most of the time I share similar beliefs with Elon. However, in this case, I don…” (ytc_Ugw9gWVsV…)
- “You're still giving a roleplay prompt in which it will use math to calculate a s…” (ytr_Ugx8O3IIq…)
- “The previous tech revolutions automated labor and communication to be more effic…” (ytc_UgzHuZL0K…)
- “A podcast warning about the terrifying impact of AI, but then advertising an AI …” (ytc_Ugy6HE2Tb…)
- “@pranavmhetre891 agreed but I gave them based on if I’m a Christian or non religi…” (ytr_Ugy_PRhSY…)
Comment
Standard UK management: chasing the hype to line their own pockets. 🤡
They’re so desperate for that short-term bonus that they’ll chuck long-term staff under the bus for AI that barely works. The stats prove it: 17% of UK bosses are already planning AI layoffs, yet 77% of workers say it’s actually made their jobs harder and increased burnout.
Even when it fails, they just ‘AI-wash’ the disaster and frame it as a ‘cost-saving success’ to the shareholders. It’s the same old story: sack the experienced, break the workflow, and then act surprised when productivity tanks. They’d rather have a broken bot than a loyal employee if it means hitting a quarterly KPI!
youtube · AI Jobs · 2026-02-19T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
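A coded record like the one above can be sanity-checked against the coding scheme before it is stored. The sketch below is not the tool's actual implementation; the allowed values per dimension are assumptions inferred from labels visible in this dump, and `validate` is an illustrative helper name.

```python
# Hypothetical validator for one coded record. ALLOWED is inferred from the
# dimension values observed in this page, not from the real codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record from the Coding Result table above passes cleanly:
record = {"responsibility": "company", "reasoning": "deontological",
          "policy": "regulate", "emotion": "outrage"}
print(validate(record))  # an empty list: the record is well-formed
```

Rejecting unknown labels early keeps a mis-parsed or hallucinated LLM label from silently entering the coded dataset.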
Raw LLM Response
[
{"id":"ytc_UgyfgAa8Rnq4NGvBUzN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhNQJBrwXADCo8C9N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw8uQW5j2ofKcOwDZZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGlY3Qqmf7RJ1mEoF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx3SSt_AkZWuStg4yJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxbLknEC7kd9zqmjPp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqAj1436q-UR-f10t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwZ4wcPNn_KJqEehfB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwrp5V9DFzidj-4YjB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0v_MitKIuj-4dhKF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
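The raw response is a JSON array keyed by comment ID, so the "Look up by comment ID" view can be reproduced by parsing the array and indexing it. This is a minimal sketch under that assumption; the variable names are illustrative, not from the tool's codebase, and the sample payload is a single record copied from the array above.

```python
import json

# One record copied from the raw LLM response shown above.
raw_response = '''
[
  {"id": "ytc_UgzGlY3Qqmf7RJ1mEoF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

records = json.loads(raw_response)            # list of coded-comment dicts
by_id = {rec["id"]: rec for rec in records}   # index for O(1) lookup by comment ID

hit = by_id.get("ytc_UgzGlY3Qqmf7RJ1mEoF4AaABAg")
print(hit["emotion"])  # -> outrage
```

Using `dict.get` for the lookup returns `None` for an unknown ID rather than raising, which is convenient when a pasted ID may not exist in the batch.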