# Raw LLM Responses

Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples:

- "LLMs are a tool that make people a bit more productive. Kind of like when googl…" (`rdc_mxz0ki9`)
- "Even more shady considering he initially wanted to become an investor. Probably …" (`ytr_UgyL03_m5…`)
- "AI would go rogue once it is given access to weapons so whatever we do, we need …" (`ytc_UgwfUaQHC…`)
- "@iceman9678 I’m not complaining about AI but I also allows that — is there a way…" (`ytr_UgwwtHMi7…`)
- "@GeoRust1 Arguably describing our survival as an unexpected result is an underst…" (`ytr_Ugyz6Ms8O…`)
- "doesnt matter, peoples who were there will just embrace the profit and fortune i…" (`ytc_Ugwg7gHdQ…`)
- "The guest doesn’t discuss the defects in AI that are making it unappealing for c…" (`ytc_UgxO_rACB…`)
- "Senator Sanders, as much as I like most of what you are saying here, you are mak…" (`ytc_UgxvhNGGN…`)
## Comment

This is going to sound callous, but I honestly think at this point that things *do* need to get worse to finally scare more people into doing more about it, rather than being scared of losing what they already have, and stuck on either the freeze or fawn response.
I also don't really see a way out of our problems *without* using AI. A lot of people just want it banned, but that isn't gonna save us. Nor does it actually do anything useful to get mad at the average person who uses it. Making your coworker or cousin or whatever your enemy over this doesn't do shit to support worker's rights or improve the environment.
The problem isn't AI in general, it's that corporations want to shove it into too much BS, to convince investors to keep giving them money.
Source: youtube · Video: AI Jobs · Posted: 2025-08-30T21:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
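A coded record like the one above can be sanity-checked against the allowed value set for each dimension. The sketch below is hypothetical: the allowed values are only those observed in this page's output, not the full codebook, and the function name is an assumption.

```python
# Hypothetical validator. The value sets below are only the codes
# observed on this page (not the full codebook) and would need to
# be replaced with the real coding scheme.
ALLOWED = {
    "responsibility": {"none", "company", "society", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "resignation", "mixed", "approval", "outrage"},
}

def validate_code(record: dict) -> list:
    """Return the dimensions whose value falls outside the allowed set
    (an empty list means the record passes)."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record coded above passes:
print(validate_code({"responsibility": "none", "reasoning": "consequentialist",
                     "policy": "none", "emotion": "resignation"}))  # []
```

Running a check like this over every parsed record catches the common failure mode where the model invents a label outside the codebook.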
## Raw LLM Response

```json
[
{"id":"ytc_Ugx8e5iGEihTURYVa5p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz91GGFujziBYuqEih4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFd4Ey_BZvPTxq3414AaABAg","responsibility":"society","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxkmky345tQqfoUnLZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzOJ-pKPfxsmrz7rP94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfRAdYIOV6xcXlXSl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxEMGdsPZjAy0DlaC94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzgFMmazJvLZPeFLWN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxHFaDmOqj4iBBlejd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVdGZYH4PuZCOotnp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
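The raw response is a JSON array of per-comment code assignments, one object per comment, keyed by `id`. A minimal sketch of parsing such a response and indexing it for the lookup-by-comment-ID view (the field names match the output above; the helper name is an assumption):

```python
import json

def index_codes_by_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects, each
    with an 'id' plus one key per coding dimension) and index the
    records by comment ID for fast lookup."""
    records = json.loads(raw_response)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

# One record from the response above, used as sample input:
raw = '''[
  {"id":"ytc_Ugz91GGFujziBYuqEih4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

codes = index_codes_by_id(raw)
print(codes["ytc_Ugz91GGFujziBYuqEih4AaABAg"]["emotion"])  # resignation
```

In practice the parse should be wrapped in a `try`/`except json.JSONDecodeError`, since a raw model response is not guaranteed to be valid JSON; this is exactly why the tool exposes the exact output for inspection.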