Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Why would ANYONE let AI drive them? If you don't like driving, take the bus or t…
ytc_UgzTOX_nE…
They are trying to force AI onto everyone but kids are an easy target and in so …
ytc_UgwZF60Oh…
I have really negative feelings towards AI, call me whiney or whatever but it's …
ytc_UgzcqQYSR…
Come work for the United States postal Service everybody! We're still using form…
ytc_Ugyhz81-v…
It feels like all the comments are coming from a massive bubble of insecure arti…
ytc_Ugymzbuls…
What is the text of the agreement please?
EDIT: Can't find any details, but I h…
rdc_cfl022z
I find the idea creepy that people, at first maybe just for fun,…
ytc_UgwgYEvkJ…
I don't trust anyone who has anything to gain from AI to tell me how its going t…
ytc_Ugwwb8dJs…
Comment
Automation *should* be a good thing. It should free us from the repetitive, dangerous, and/or otherwise mundane jobs and allow us to pursue more meaningful activities, both personally and as a society. And people who want to keep doing those automated jobs should be able to, without financial worry. (In other words, survival should not be tied directly to employment, but that's a societal issue, not a tech issue.) I say *should* because that is clearly not what is happening. Automation is replacing us, yes, but those other pursuits are not available.
I'm with Steve here. The tech is not the enemy. It's the people who implement and weaponize this tech against workers that are the enemy. The tech is a tool. How we use it is up to us.
youtube
AI Jobs
2025-12-23T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzvBQ8ksUEN_nEyTH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzsFHsmRI8ZEznZBw54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwN96M0weHDNCMK8Yx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxozIdf8zIyndB9lQ14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzZtZQMXjU8UwbVJy14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzyFklaU2QzdkxHi6p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyFW0pqMiC45Ue9SXN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhHkvViguxuZlJF054AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-SGdEzY4o66j2MaN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyYXILf6tx22aHe6ot4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"}
]
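The raw response above is a JSON array with one object per coded comment, carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response can be indexed for lookup by comment ID, using two of the rows above (the parsing approach and variable names are illustrative, not taken from the tool itself):

```python
import json

# Raw model output: a JSON array of per-comment codes, as in the
# response above (abbreviated to two entries).
raw_response = """[
  {"id": "ytc_UgwN96M0weHDNCMK8Yx4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgzvBQ8ksUEN_nEyTH54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

# Index the codes by comment ID so any coded comment can be inspected
# directly, as the "Look up by comment ID" feature suggests.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwN96M0weHDNCMK8Yx4AaABAg"]
print(code["responsibility"], code["policy"])  # distributed liability
```

The same dictionary can back the "Coding Result" table: each row is just one dimension/value pair from the looked-up object.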