Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The people who are creating these programs don’t have anything to do with God of the Bible therefore the AI’s have no soul (s) : or mind, will, emotions or feelings and discernments. There is no human (ness) about them. Looking at humans as a minority as they say, is a very dangerous worldview…
Source: youtube · Category: AI Governance · Posted: 2023-06-24T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxNvrUs_UCBOro8Vdd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzr8Z4ODRpjlmXaxTJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyHq9tSTIhaLuliRFN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzCJ8fXQ8-Dz2NfNop4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxh3ufCAkjW8yMjcKN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwZVcZSXmjlF1YBURp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxtsFGIKehhPrkyynt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwmqGwB8ss5DIYNOeV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtM6we43qnX-3-MCV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwkz78vgeUuu_OCcXB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
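The raw response above is a JSON array with one record per comment, each coded on four dimensions. A minimal sketch of how such a batch could be parsed and validated before the codes are stored — note the allowed category sets are inferred only from the values visible on this page, so the real codebook may define additional values:

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# responses shown on this page; the actual codebook may include more).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID and an in-schema value
        # for each of the four dimensions.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example with two records in the same shape as the response above:
raw = """[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "banish", "emotion": "outrage"}
]"""
kept = validate_batch(raw)  # second record is dropped: "banish" is not in SCHEMA
```

Validating against a fixed schema like this catches the common failure modes of batch coding — truncated JSON, missing dimensions, or hallucinated category labels — before they silently enter the coded dataset.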