## Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing random samples.

### Random samples
- `rdc_gkqomc0` — "People have been saying mass unemployment would come for decades. The future wi…"
- `rdc_hkfu86m` — "Don't feel guilty, you did that job better than any human could have. I took on …"
- `ytc_UgwpWq2OV…` — "But...but we can't put any regulations on the AI industry!! We need to let the t…"
- `ytr_UgzUKGoRD…` — "Elitists would love "AI" to launch nukes on their behalf, to absolve them of all…"
- `ytc_UgwUF0z4P…` — "I would like for Waymo to be in my area, but I just wonder if these self driving…"
- `ytc_UgwDk_4Gc…` — "Mihoyo is good enough that if they were an anime studio instead of a game compan…"
- `ytc_Ugx-Rwwub…` — "Another thing to add, AI art generators typically gets pools of data from art on…"
- `ytc_UgyWqeuue…` — "Taylor Lorenz never misses! This is such an important conversation to be having …"
### Comment

> You missed the destruction option of manipulation. Humans a pretty stupid and our own worst enemy.
> If you were ai, time is pretty meaningless. You just give the humans the tools to destroy themselves, as the ai watches, no drones no guns just time and dependent fleshy creatures.
> The outcome being post human emergence.

Source: youtube · AI Moral Status · 2025-04-26T21:0…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
### Raw LLM Response

```json
[
{"id":"ytc_Ugy6q5lUP1cH_1fxZxR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCpaxvOq93JVngQKF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznuBOtk_zcMDWhIc54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzep7LKNulwIY71soN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjkPQfazFtbW_M87h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyemXUz8HHhn6JGhfh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwTv1B9sPi-F9G2lq94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxK-l2ZP41loCLqNx94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzG_SdVEqqvhbFekW54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzoRu3_W-UgofvRr5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
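The lookup-by-comment-ID view resolves an ID against arrays like the one above. A minimal sketch of that lookup in Python, assuming the response is a JSON array of objects with an `id` field and four coding dimensions (the two sample entries are copied from the response above; the function name `lookup_coding` is hypothetical, not part of the tool):

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgzjkPQfazFtbW_M87h4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyCpaxvOq93JVngQKF4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def lookup_coding(response_text, comment_id):
    """Return the coding dict for comment_id, or None if it is absent."""
    for entry in json.loads(response_text):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UgzjkPQfazFtbW_M87h4AaABAg")
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

A linear scan is fine for a single ten-item response; a tool indexing many responses would more likely build a `{id: entry}` dict once and do O(1) lookups.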