# Raw LLM Responses

Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.

## Random samples
- `ytc_UgzzWq6LI…` — Y'all need to stop listening to this complete dumbass. Remember when he 'intervi…
- `ytr_UgyTktNtr…` — > *The end game for all ai is not to make the world a better, easier place* Cons…
- `ytc_UgwqFs-nZ…` — Bro thw way the girl or the robot looked at her when it was begining sode eye…
- `ytc_UgxxzvyY2…` — speak for yourself yank. luckily the rest of the developed world are filthy lazy…
- `ytc_Ugyr4IdWK…` — AI is useless. It reminds me of the VR craze in the 80's and early 90's. The onl…
- `ytc_UgyfaxMus…` — As a teacher this model is completely possible and is great because kids get to …
- `ytc_UgxgRrbWu…` — Please go to big bang time and stop that. Let us save this earth from this AI. 😮…
- `ytc_UgxDBMY_J…` — Of course humans. People tend to build robot's faces perfect and symmetrical, bu…
## Comment

> I think what the less informed could be missing in the nuanced conversation about AI is, we do not 100% without a shadow of a doubt understand consciousness. Many, wrongly believe that God or other higher level beings are responsible, note, this is not a scientific position and therefore is called hopium disguised as faith. We should not rely on faith when potentially creating a new higher level life form. We are an organic form of NN created using sophisticated molecules, however, the energy that runs our NN is the same that runs a computerized NN. It is logical to assume that the "potential" is there for an artificial NN to become self aware same as a biological one because WE DO NOT KNOW HOW IT WORKS.

youtube · AI Governance · 2023-03-30T13:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
```json
[
{"id":"ytc_Ugx_3y3ecguJJ139t3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwEd62xYBNpZhy6mcZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwNLktpzIyBQy9lhDN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugzala4-7odgXsPej-B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzdncGJBzRKnxaWEN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgySxkH4hmpB_5HZIpd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZCE2nBTCYoPNjAY54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzNbDVMS_Qn-ukZnUp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwlqZvdIjNFrD8uWsV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxxjLKIq9nF6j2inTV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
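A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below validates each record against per-dimension value sets; note the `ALLOWED` sets are an assumption inferred from the sample output shown here, not the project's actual codebook, which may define more categories.

```python
import json

# Two records excerpted from the raw batch response above.
raw = '''[
{"id":"ytc_Ugx_3y3ecguJJ139t3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZCE2nBTCYoPNjAY54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate(records):
    """Return (comment id, dimension, bad value) tuples for out-of-codebook codes."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

records = json.loads(raw)
print(validate(records))  # → [] when every code is in the codebook
```

Indexing the parsed records as `{rec["id"]: rec for rec in records}` then supports the look-up-by-comment-ID view directly.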