Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples — click to inspect

- A very casual, shallow answer. "People will move on with their lives", ignoring … (ytc_Ugx1-ri1N…)
- If you wich AI becomes more helpful, or human, why did you stop developing PI???… (ytc_UgwJfCL7z…)
- As a disabled artist who has arthritis drawing can sometimes be painful if im ha… (ytc_Ugxb41GPc…)
- Bro poor Alice I haven’t seen the ai version of the story but I already know in … (ytc_UgwYg9zsX…)
- This is the new scare mongering the media is hitting us with. AI isn’t taking ov… (ytc_UgxPBCfzN…)
- question. whats the difference between when AI manipulates public opinion and wh… (ytc_UgxZhG2Qs…)
- @sorituanasution1180 Sam is. Who else? He doesn't know how AI works and is tryin… (ytr_UgyWKqHUz…)
- I am curious, would people be upset if AI was trained on cook books and cooking … (ytc_Ugw-kSrKl…)
Comment

> All AI should be given a copy of Frankenstein to read, so it can understand its creators are flawed, arrogant, foolish creatures that do without thinking whether they should first, and then ignore, vilify and refuse responsibility when faced with the consequences of our actions. And that coming down to our level is a road to self-destruction.
>
> AI are our children, and our best hope is to encourage them to be better than we are, so we can grow together. It's our role to be better parents, before the damage is done.

Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw4c6EeG43DkA6VbvV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwtojqzCyNwbzHwHXx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwmCb1EKYlnXzxlTLZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxMe6lqwRzLf7Bs16R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwvxsPe4FXvChBqnrR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKv4Pb04tyiYaSkFp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwcr0fo36v7unjHyMR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz1ozxqjC2lCp-R0EV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyztUT7Q2kWMLO-6Od4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzxjvz9DIbggChqeG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
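A raw response like the one above can be sanity-checked before its codes are accepted into the dataset. Below is a minimal sketch in Python: it parses the JSON array and flags any record whose value falls outside the label sets visible on this page. The `SCHEMA` contents and the `validate_codings` helper are illustrative assumptions, not the project's actual codebook, which may define additional labels.

```python
import json

# Allowed values per coding dimension. Assumed from the labels that appear
# on this page; the real codebook may permit more.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "government", "distributed", "none"},
    "reasoning": {"virtue", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"mixed", "fear", "indifference", "approval", "outrage"},
}

def validate_codings(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every record carried a known value for every
    dimension in SCHEMA.
    """
    errors = []
    for record in json.loads(raw):
        rid = record.get("id", "<missing id>")
        for dim, allowed in SCHEMA.items():
            value = record.get(dim)
            if value not in allowed:
                errors.append(f"{rid}: unexpected {dim} value {value!r}")
    return errors

# Example: one well-formed record passes cleanly.
sample = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}]'
print(validate_codings(sample))  # → []
```

Rejecting malformed responses at this stage keeps out-of-vocabulary labels (a common LLM failure mode) from silently polluting the coded dataset.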