Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
Random samples:

- "Mr Josh Hawley, say FB have some problems, it’s true??? Just ask 😌 ever FAN 😁…" (`ytc_Ugw51uplb…`)
- "Logitech didn't build AI into their mouse... they suckered investors and AI Supe…" (`ytc_UgykyEhYh…`)
- "Wait… he was planning on knocking the robot made of metal out? The machine? SMH …" (`ytc_UgzNewQkk…`)
- "There was a quote from an old cartoon I used to watch occasionally. Paraphrasing…" (`ytr_Ugz7WKQi8…`)
- "The irony of AI art bros is that they tell us artists to shut up and deal with i…" (`ytc_Ugzs-kNTw…`)
- "Not taking advantage of Earth's resources means losing time. Losing time means l…" (`ytr_Ugzp0-sHz…`)
- "Coding doesn't matter. You have to be strong with logic building. Rest give the …" (`ytc_UgzTg-sOD…`)
- "Many of those are a good start. Some of them aren’t as realistic, though. Unless…" (`rdc_o7vyo3p`)
Comment
What happens when all the white collared people who are out of jobs, start learning/applying for blue collar work and now that field is saturated? AI will be bad for the white and blue collar workers. It will destroy the economy and lead to unrest if it causes mass unemployment.
youtube · AI Governance · 2025-08-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw8MnH6lBQj5yCeJIt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzj-w_uTNnSCERZB0p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz7XAuglxBOe3j-xfN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFjsWLpFxhdQ5GW3h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwghsx7NXoH1kA7wFt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyshW-sF7xXVtnzOEh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxzr9gwiwfwBxfYX3d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwk7drRFAAv4U5EUmN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwiPt6WWthRKTn3DGt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy2JZdm_zmjLFb0POp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
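The raw response above is a JSON array with one object per coded comment, each carrying an `id` plus the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such output could be parsed and indexed for the by-ID lookup the page describes — the dimension names are taken from the table above, and the sample rows are copied from the raw response; the real codebook may allow more values than the ones visible here:

```python
import json

# Two rows copied verbatim from the raw LLM response above.
RAW = '''[
 {"id":"ytc_Ugw8MnH6lBQj5yCeJIt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgzFjsWLpFxhdQ5GW3h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]'''

# The four coded dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw model response and index each coded row by comment ID.

    Rejects rows that are missing the id or any of the four dimensions,
    so malformed model output fails loudly instead of silently.
    """
    out = {}
    for row in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row {row!r}: missing {missing}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

by_id = index_codes(RAW)
print(by_id["ytc_UgzFjsWLpFxhdQ5GW3h4AaABAg"]["responsibility"])  # distributed
```

Indexing by `id` makes the "look up by comment ID" operation a constant-time dictionary access, and the validation step catches rows where the model dropped or renamed a dimension before they reach the coding table.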