Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I see parents, especially the mother grieving and finding someone to blame. Yes…" (ytc_Ugw8kjfnd…)
- "Respectfully, there is a higher power than AI, the God who created all things. AI…" (ytc_UgxxX0Nnd…)
- "This thread is even scarier that this info. No energy, no AI. Grids will fail. …" (ytc_Ugzk_wJ3g…)
- "Interesting issues brought up. But why does it matter if a person is killed by a…" (ytc_Ugj34Qf8U…)
- "100% nailed what i feel. I only use ai art in private d&d sessions with friends …" (ytc_Ugw_rNe5z…)
- "so should we automate the creative jobs too, or just the menial ones? fuck that.…" (ytr_Ugy0bKL4K…)
- "I was a commercial artist for the fabric industry in the 80s. In 1989 the indust…" (ytc_UgwO5lev9…)
- "I think you did a good job at dispelling the doomers but didn't do a good job at…" (ytc_UgwaKw5M_…)
Comment
Ezra comes off as emotional and is anthropomorphising. Generalised AI technology would be dangerous because it cannot be ring-fenced; it is like looking into a fractal: the deeper you go into it, the more escape routes there are for the superintelligence. It will always outwit humans. The only way to avoid human extinction is to give it only very specific rules and silo the utility of each programme and what your aim is with it, i.e. to cure prostate cancer. AI is so infinitesimally complex, it can't be contained any other way, and if we allow it, it will happen so fast and be so far beyond any human acceleration. We can't silo it; it isn't possible. We've had it unless we stop now.
youtube
AI Governance
2025-11-28T17:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwt6DaGWcFenvlbTBp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxyf_KfEQ2SLYok9-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNer6d7CZRXqmlaG54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzR2vrJcaG9Ig5JR1B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyZZ9jHk4bbQfEP0Dx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuPRpKBn8Os3Fin7N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzsg8sUUHkAulH3hU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLJvQHEMlGkfLb-Ph4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4HfiN8Djm14kV0pR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5zIlJQ3h8lFTfr1F4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
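The raw response above is a JSON array with one record per comment, keyed by comment ID and carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed to support the "Look up by comment ID" view; the two sample records are copied from the response above, while the helper name `index_codings` is hypothetical:

```python
import json

# Raw coding response as returned by the model: a JSON array of
# per-comment records (same shape as the response shown above).
raw_response = """
[
  {"id": "ytc_Ugwt6DaGWcFenvlbTBp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyZZ9jHk4bbQfEP0Dx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model output and index records by comment ID,
    skipping any record that is missing a coding dimension."""
    by_id = {}
    for record in json.loads(raw):
        if all(dim in record for dim in DIMENSIONS):
            by_id[record["id"]] = record
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_Ugwt6DaGWcFenvlbTBp4AaABAg"]["policy"])  # -> regulate
```

Keeping the records keyed by ID makes the per-comment lookup a single dictionary access, and the dimension check guards against partially coded records in the model output.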