Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples
As a half natural and half acquired anarchist (not the bombthrowing kind), this …
ytc_UgzL6mWLk…
I do not like driving near all of the driverless cars in AZ. They follow too clo…
ytc_Ugx1WDWuD…
My takeway from this is... kinda flipped. I am not worried about AI. I am worrie…
ytc_UgxYt01Zw…
I actually disagree, we might be missing a bigger point. Even if the work is pro…
ytc_Ugwxdogz3…
I was expecting you to shit on ai bros not rant about the beauty of art and the …
ytc_UgyepDd14…
Well yeah, but it could be that this, as an imperfect detector, skews the genera…
rdc_i6saw8r
I love AI, i use it to make art a lot but im smart enough to know that doesn't m…
ytc_UgzlJehfl…
AUTOFAC by Philip K Dick. The AI super factory kills off all its potential custo…
rdc_oh37mju
Comment
we looking forward to the future and is based on the thinking of some people in the field, but the subject is terrifying if you think in the long term, because techno-we are inventing it ahead of us, but we remain in trouble.
Can we control them?
Artificial Intelligence is aware that you are worried about it. You are thinking about machines by thinking about them.
So they been voluntary
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2018-12-01T02:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwEKdBGAxGpH3LBKMt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyoaq3WC3YNDSHbX214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAWXMJituOFUBdpwh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwv2M0eTr-NoxVejEB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwynKgc1AgSKPp7_WJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzrmUfkFk1cnDmzrcV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyXx2ra8KtMC_hYTWF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyscPRWbBZzb2ZtEIJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugxg7NzZqNxMfj9_Nct4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy7_gTWaCtyAngWp8x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
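The raw response is a JSON array with one coding object per comment, so looking up a coding by comment ID reduces to indexing the parsed array. A minimal sketch (the field names and ID format come from the response above; the parsing code itself is an assumption, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment
# (abbreviated here to two entries from the response above).
raw_response = """[
  {"id": "ytc_UgwEKdBGAxGpH3LBKMt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyoaq3WC3YNDSHbX214AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]"""

# Index codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwEKdBGAxGpH3LBKMt4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

The same dictionary can back the per-comment "Coding Result" view: each value holds exactly the four coded dimensions shown in the table.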