Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_dwvp4w7`: Unpopular opinion: It's because they're already working on it and they don't wan…
- `ytc_UgwaENVFo…`: You guys are using AI wrong. Take the AI work and then read it out loud and chan…
- `ytc_UgzKsm6nm…`: The question I ask, and can't answer, is outside those who control AGI, why will…
- `ytc_UgzVBDzY2…`: Artificial Intelligence doesn't need rights. Once we give AI and robots rights t…
- `rdc_czxujl5`: And it begins....Google not only has a lot of money and irons in the fire (robot…
- `ytc_Ugx9-ij6f…`: What's wrong with using AI? You cannot find fault anymore from China, you make u…
- `ytc_UgyF5xO47…`: I'm disgusted that I ever had an account on Character AI, and I'm so, so happy I…
- `ytc_UgxDAcmwj…`: regulator structure. hmm. I believe the ai will say. humans are cause of poverty…
Comment

> blah blah blah, then when everyone goes into blue collar theres going to be a breakthrough in robotics and they will need more people to help program/develop the robot's sensors alongside blue collar market is oversaturated, and then the opposite happens again too many WC, need more BC workers. Just do what you like, be really fking good at it, be able to market your skills, and be able to learn new things. Thats how you stay employed.

Source: youtube
Posted: 2025-08-25T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOr0Ef9puXbvr2HNl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw_WgppNpMAvvDIT894AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz9TItLQZbNr-De_IZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyd_jhSPHFcpWedvLF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugzox6ZmbRhvSbDlxXJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyVk2En78_AUneogv94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzZXROKT3we6SlUc4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwXOWHrEsMqCfoy2jF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzL5AbG9Jf6aCLg7L94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyoFUb0J2Yu8VfgDRh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
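Inspecting the exact model output for a given comment amounts to parsing this JSON array and filtering on the `id` field. A minimal sketch of that lookup, assuming the batch format shown above (the `lookup_coding` helper name is hypothetical, not part of the pipeline; the dimension keys mirror the array here):

```python
import json

# A raw LLM response: a JSON array of per-comment codings, as in the batch above.
raw_response = """[
  {"id": "ytc_UgyOr0Ef9puXbvr2HNl4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw_WgppNpMAvvDIT894AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UgyOr0Ef9puXbvr2HNl4AaABAg")
print(coding["emotion"])  # indifference
```

Keying on `id` is what lets a batched response be joined back to individual comments, so the four coded dimensions in the table above can be displayed next to the comment they describe.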