Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Never found the "ai uses a lot of energy so bad argument terribly useful". Why d…" (ytc_Ugyoe_csC…)
- "Your neurons are firing based on what you've seen and heard in the past. What ma…" (rdc_mzyf2fn)
- "People need meaning in their life if they aren't able to relate with other human…" (ytc_UgxDi1d59…)
- "I’ve personally never thought 3d or photoshop was “fake art”, as there is still …" (ytr_Ugws6KLlL…)
- "The way I see it, generative ai is just a glorified image search, you didn’t mak…" (ytc_Ugy1iEtPq…)
- "Ways to teach kids life skills.... We took it out... We used to teach them job s…" (ytc_UgynWXbfK…)
- "If you base your algorithms on a bias, discriminatory, racist colonial justice s…" (ytc_UgwETmyap…)
- "How can AI be used to deal with iniquities in the justice and government system,…" (ytc_Ugw9aWZEe…)
Comment
I respect Elon Musk's expertise and vast knowledge, I agree that humans should be cautious with AI, that it is a danger to the public, and that it should be regulated and constrained as need be. That said, the idea that human beings can create anything vastly superior in any respect to ourselves who were created by God and in the image and likeness of God is a fatally flawed premise, a foundation of sand, on which absolutely nothing can be based.
God Bless Elon Musk, I believe he is a genuine white hat who is striving for good. I pray he'll be able to turn Twitter into something useful and I pray for his conversion.
😊🙏🇺🇸
youtube · AI Governance · 2023-04-18T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
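The four coded dimensions above map naturally onto a small record type. A minimal sketch, assuming only the dimension names and example values visible in this page (the `Coding` class name is illustrative, not part of the pipeline):

```python
from dataclasses import dataclass

@dataclass
class Coding:
    """One comment's coding across the four dimensions shown in the table."""
    responsibility: str  # e.g. "developer", "government", "company", "none"
    reasoning: str       # e.g. "mixed", "virtue", "consequentialist"
    policy: str          # e.g. "regulate", "liability", "none"
    emotion: str         # e.g. "fear", "outrage", "approval"

# The coding result from the table above:
c = Coding(responsibility="developer", reasoning="mixed",
           policy="regulate", emotion="fear")
print(c.policy)  # regulate
```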
Raw LLM Response
[{"id":"ytc_UgxMign9RcQvTC7TfE54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgwhxqAwih80IWtC76h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgymgVObtNbalIBwpkF4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_Ugxhhib1nxFdATN4P8t4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},{"id":"ytc_Ugw5gIehTBAAbdEdFOl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},{"id":"ytc_Ugy0gpK9lknDDp7QE3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgxrQJhbXCQu7VGxI-J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_Ugzec2SnYhOCihwVHo54AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugw8UcTwxkMJ0-bzvWh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},{"id":"ytc_UgxM2fPULwtwZf_6Z8F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}]
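The raw response is a single JSON array with one object per coded comment, so the "look up by comment ID" step reduces to a parse and a scan. A minimal sketch, assuming the batch is returned exactly as shown above (the helper name is hypothetical; the two entries here are copied from the raw response):

```python
import json

# Abbreviated copy of the raw LLM response above: a JSON array of codings.
RAW_RESPONSE = (
    '[{"id":"ytc_UgxMign9RcQvTC7TfE54AaABAg","responsibility":"developer",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"},'
    '{"id":"ytc_UgwhxqAwih80IWtC76h4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return one comment's coding, or None."""
    codings = json.loads(raw)
    return next((c for c in codings if c.get("id") == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_UgxMign9RcQvTC7TfE54AaABAg")
print(coding["responsibility"], coding["emotion"])  # developer outrage
```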