Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Did anybody noticed the huge gap distance The man was standing from the robot … (ytc_UgyVj6PRK…)
- @OnigoroshiZero lmaooo bro is so bad at rage baiting, just use ai to write for y… (ytr_UgxmgnP6P…)
- Not at all, ChatGPT is far more human and empathetic than what is said, they ar… (ytc_UgyYMdMdD…)
- Humans have been trying to kill other humans for a very long time. AI is just a … (ytr_Ugz0pQvfI…)
- Q: Big tech companies are controlling the development of these AI: A: "There are c… (ytc_UgyCo64s1…)
- BUT if robots replace jobs, people won't have money to buy things, then the robo… (ytc_Ugxb23CH_…)
- Because nobody's going to do anything about it. The government needs Boeing, Ama… (ytr_UgxZmeNu4…)
- You're looking on current llms .... That's really bad assumptions for the future… (ytr_UgzVq_-tn…)
Comment

> I am absolutely shitting in my pants at the fact that this AGI is evolving and most of humankind will always be reactive and never really BE proactive to any situation of concern. Maybe the tiny amount of real serious preppers out there should include in their bombshelters going back to college to gain a PHD in computer science AI , that might help? I don’t think countries like China and Russia are gonna give a shit about behaving in a safe manner for the world’s best interest!

youtube · AI Governance · 2023-03-30T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_3y3ecguJJ139t3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwEd62xYBNpZhy6mcZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwNLktpzIyBQy9lhDN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugzala4-7odgXsPej-B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzzdncGJBzRKnxaWEN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgySxkH4hmpB_5HZIpd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxZCE2nBTCYoPNjAY54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzNbDVMS_Qn-ukZnUp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwlqZvdIjNFrD8uWsV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxxjLKIq9nF6j2inTV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
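A raw response like the one above can be parsed and sanity-checked before the codes are stored. This is a minimal sketch; the allowed values per dimension are assumed from the examples on this page, not from a published schema, and the `ytc_`/`ytr_` ID prefixes are likewise inferred from the samples shown.

```python
import json

# Allowed values per dimension -- assumed from the examples above,
# not an exhaustive or official schema.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record.

    Raises ValueError on an unexpected comment ID prefix or an
    out-of-vocabulary dimension value.
    """
    records = json.loads(raw)
    for rec in records:
        # Sample IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

Rejecting unknown values early keeps a single mis-formatted model output from silently polluting the coded dataset.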