Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
One comment I've seen recently that infuriates me: "Consumers don't care about t…
ytc_Ugwa2htiU…
So your solution is to go to a website that hosts publicly available images will…
ytc_Ugzxj_8fi…
If Ai somehow took all jobs there would be no need for money anymore, everyone c…
ytc_Ugwedde_o…
Tesla has also said to pay attention when using the self driving ft for years un…
ytc_UgyuGm2tx…
Apart from the reality that AI would win hands down in every way, against human …
ytc_Ugy6-n7UQ…
Is there any possible way to opt out of these sweeping changes to the way we hav…
ytc_Ugzqkv9U3…
Omg, what is this lady saying? Ellon musk started open AI because he was scared …
ytc_Ugxi9Sr1E…
There are simply jobs AI couldn't physically replace or wouldn't because it isn'…
ytc_UgxnUY7sa…
Comment
The thing that worries me is not losing jobs, its that we're losing jobs with the "expectation" that AI will replace them. Not because it actually is currently a better alternative. When cars came out and changed the world you could concretely see the vast benefit they had over the alternatives. When you look at AI replacing programmers, they only seem like a metric leap for people who only have a vague understanding of programming and more like a semi useful tool for experience programmers/software engineers. We are assuming something is going to change the world over impressive parlor tricks not tested value, which is fine if we weren't completely doubling down on the technology and throwing trillions into data centers.
youtube
AI Moral Status
2025-07-23T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugx3xQJ-fqmwvZIjDrZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTcWf2kvWsDJvdMv94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz9W0G8xw5fIHJBmtB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwRB5brR5GDOeXEfWZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjLaVRQrgevVWTApx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSR60aaERi23xG-aB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw2OFqjawwc2mgQ0bl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw93BS_dcGp_vY1YDZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyZanRf-bhSHOTzeEx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwEsV8t-jAeQf56qK54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
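A minimal sketch of how a raw response like the one above can be parsed into per-comment coding results. This is an illustrative example, not the tool's actual implementation; the allowed-value sets are assumptions inferred from the sample output rather than a definitive codebook, and any value outside them falls back to "unclear", mirroring the Coding Result table.

```python
import json

# Assumed coding dimensions and value sets, inferred from the sample
# responses above; not an official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of codings) into clean rows.

    Any dimension that is missing or holds an unexpected value is
    recorded as "unclear".
    """
    coded = []
    for row in json.loads(raw):
        entry = {"id": row["id"]}
        for dim, allowed in ALLOWED.items():
            value = row.get(dim, "unclear")
            entry[dim] = value if value in allowed else "unclear"
        coded.append(entry)
    return coded

raw = ('[{"id":"ytc_Ugx3xQJ-fqmwvZIjDrZ4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(parse_coding_response(raw)[0]["emotion"])  # indifference
```

In a pipeline like this, validating against a fixed value set is what lets off-schema model output surface as "unclear" in the results table instead of silently polluting the coded data.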