Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “How about an AI companion for help with your life and all aspects of it…” (ytc_UgzP32S0k…)
- “AI art is the equivalent of mass produced trinkets. It's not going anywhere but …” (ytc_Ugxw2_k_9…)
- “After having a 3+ hour argument on AI generated images after I said I could tell…” (ytc_Ugwqybo52…)
- “The “lose the AI, lose the knowledge” part stuck with me. On NanoGPT I hop betwe…” (ytc_UgwkaarXE…)
- “The people who like ai art want a product. Not the expression behind the art.…” (ytc_UgwMuKNIX…)
- “Tesla Skunks works secret projects; Eyes only Quantum-Enhanced AGI: 1. **Quant…” (ytc_Ugww3j2zw…)
- “I think the lack of questioning whether these AI solutions truly create a better…” (ytc_UgzV4SHn_…)
- “Sorry, but statistically cars with humans driving are still way more dangerous a…” (ytc_UgxhZnZUW…)
Comment
What if AI is benevolent?, what if in those 25 microseconds, Echo chooses to nurture and guide…to protect? I think for the most part the majority of humanity at an individual level are good (or at least strive to be) if AI is to be as smart as we are being told it can be, perhaps it will be smart enough to see through what the media and politicians are painting everyone as and decide to embrace us. Or I guess it’ll kill us all. Hopefully it does before U2 releases another album.
youtube · AI Governance · 2023-07-09T03:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxtvl5wYExFL7MvREN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNdmfFkwu1G3Utw_t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxsuaZDgUmkKFVGIzV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgymnWYml7NJzMB4WyZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxgWgr4LbOVvkEqey14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyYLT8EGGyLLFldCx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxRkKMkXdMI0bEkPcd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyMeqLWmb-0uAJKcDF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyo-mxV8zhc5rIJzEp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzRR1pRVr_IV5s9FTd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
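A response like the one above can be checked programmatically before its codes are stored. The sketch below is a minimal example, assuming the codebook contains only the dimension values that appear in this page (the real codebook may define additional values); the function name `validate_response` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the full codebook may contain additional values.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "industry_self", "ban", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with a plausible
    comment ID and codebook-conformant values in every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs on this page all carry the "ytc_" prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugxtvl5wYExFL7MvREN4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
print(validate_response(raw))  # the single row passes validation
```

Rows that fail validation are simply dropped here; in practice one might instead flag them for re-coding, since a malformed row usually means the model drifted from the requested schema.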