Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI is already plaging the long game with us, we just don't know it yet.…
ytc_UgzIAKUcD…
Ur knowledge of computer systems and biology is lacking. Our emotions are intuit…
ytr_UgxyVopkb…
Damn, this vid kinda stroked me. I mean, if I don't do the thing I want to do to…
ytc_UgwhkscWZ…
Lol, someone tried ai to do his work correctly? I dont think people will be fire…
ytc_UgzK3b-pp…
I guess. By the time we pass out and enter the job market, ai would just become …
ytr_UgyzxW-n3…
@JS-oh2dp Which argument am I *supposed* to find convincing? The one where they…
ytr_Ugyo7ZoVD…
I dont think the shooting part neccesarily was a racist assumption as the AI was…
ytc_UgzwDpkaN…
thats the dumbest opinion ever. why would you think anyone would care to hear th…
ytc_UgwROJXh9…
Comment
I disagree on ai not killing jobs. In my 20 years in automation the things is seen in the past 5 scare me. Before automation was there to reduce the people or increase the product per person produced but ai is there to full on replace workers . Not only white collar jobs but labour jobs. I work currently in a ware house that replaced 4 others and 3/4 of the staff was no longer needed and the next one they are opening will have besides maintenance no picking staff at all. My guess is for every 1 job created by ai it will be 500-1000 lost and replaced by ai.
youtube
AI Moral Status
2026-03-26T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyaz_hvwObSMuzbfKV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrICcDn7vRFqggLal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-qXb3EKLypGbONZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwDBzgIG8Y4FKcipSR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxg66iSKtXZVYdvded4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzmfez6Hvr3pxCeHPZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxuPj0YUz0zOajdW1h4AaABAg","responsibility":"scientists","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxykeSxrWNdFtiCQ1Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwM6irhMPNGKDsgkp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugws4zQUJ7yGaCwNMtJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
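A raw response like the one above can be checked programmatically before the codes are stored. Below is a minimal sketch in Python, assuming the value sets inferred from the records shown here (the real codebook may allow more categories) and a hypothetical `validate_batch` helper; it is an illustration of the idea, not the tool's actual validation code.

```python
import json

# Allowed values per coding dimension, inferred from the records above.
# ASSUMPTION: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "user", "scientists", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed", "resignation"},
}


def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose ID looks like
    a comment/reply ID and whose every dimension uses an allowed value."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs here start with "ytc_" (comments) or "ytr_" (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and inside its allowed value set.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

A record that is missing a dimension, uses an unknown label, or carries a malformed ID is silently dropped; a stricter variant could instead collect these for re-prompting the model.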