Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "out of all the ways for the world to end, this is probably my favorite, ROBOT WA…" (ytc_UggfTMZqX…)
- "Robots got some movies like Terminator as a reference 🥲🥲🥲 if AI will develop the…" (ytc_UgysCQ87y…)
- "We appreciate your creativity! While Sophia may not be getting a sword anytime s…" (ytr_UgxNYdKYN…)
- "Become a juggler. Jugglers are always gonna have jobs. Nobody wants to watch a r…" (ytc_UgyVSlIlt…)
- "Using face recognition to lock you up wow so your accuser is a computer and that…" (ytc_UgzqS1Bkw…)
- "POV: Someone saw your character ai chats vs Someone saw your character in ai cha…" (ytc_UgyPcnciT…)
- "AI, please add some money to last name bank account--- due to computer controls…" (ytc_UgyYEEzs0…)
- "I agree with you. I had a thought experiment where we tried to figure out what w…" (rdc_cz3742l)
Comment

> I dont think AI taking our jobs is a bad thing, if they do the job well enough. What is bad, is that we are controlled by a system that always takes our new inventions and uses them to enslave us. Inventing the car should have saved us time and made us work less. But corporations made it to where we spend more time working because it is easier. AI could help take off our load and make it to where we can produce the same for a shorter work week. But whoever owns it decides what it is used for. And that is to save companies money. It wouldnt even be that bad of a thing if we could just make our own business instead. But the American dream is gone. You cant start a business unless one of these companies wants you to.

youtube · AI Jobs · 2025-10-09T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgySYFRv1ymYVwNd1hB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBPeDtAFY1nxjGAWR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyy19JtjmmN2XL34HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVvSN0YAQlIIdpxZJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwZsUcAdP5KhFB79zZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzN5p81HTcLPKnhlWl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzo7S4SNnd6ZD0QX9d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxk7fGD1z-DwmIUqv94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh1FH8vJciGuhZUPd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz2jZ4XKcDSz7esG0B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
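The raw response above is a JSON array of per-comment codes, one record per comment ID. As a minimal sketch of how such a batch could be parsed and sanity-checked, the snippet below validates each record against a closed vocabulary; note the `SCHEMA` sets are inferred only from the values visible on this page, not from the project's actual codebook, and `validate_batch` is a hypothetical helper.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown above.
# (Hypothetical schema: the real codebook may define additional categories.)
SCHEMA = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "approval", "mixed", "fear", "outrage"},
}

def validate_batch(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and reject records with out-of-vocabulary codes."""
    records = json.loads(raw_response)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# Example with a single record in the same shape as the response above.
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"company",'
       '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
coded = validate_batch(raw)
print(coded[0]["responsibility"])  # company
```

A check like this catches malformed JSON or invented category labels before coded records enter downstream analysis.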