Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I once saw an "AI art contest" skit made to defend the use of AI by going "look,… (ytr_UgyF3GQFA…)
Nothing scary at all its a robot coded by humans to speak like this stop fearing… (ytc_Ugy4i6eE6…)
The thing is that there's no soul in any art, Ai or not! I know it's scary and p… (ytc_UgzMO24uc…)
The public and public sector need to understand AI, and regulate it effectively,… (ytc_UgzEtjFPu…)
If anyone remembers Conway's Game of Life which was software written in 1970 whe… (ytc_UgzUgLam1…)
People need money to buy things; replacing everything with it would create a… (ytc_UgxSykvtC…)
When A.I. begins to dumb down on intellectual information and / or conversation… (ytc_UgyJM5HAo…)
The most important thing in all art is intent. I’m gonna go back to before the d… (ytr_Ugwe2AGJC…)
Comment
Sorry guys, you are wrong on this. Soldiers are already conditioned to be killing machines. They are taught to kill the enemy designated by their superiors on command without question. Add to that that humans get tired and distracted, and you begin to realise that a properly programmed AI with final authority over whether it fires or not is likely to be far more capable of distinguishing non-combatants and friendlies from hostile forces.
That is, so long as that is a concern of the programmer.
youtube
2012-11-24T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNYuH_o45JNtq5kbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWO5tHzZYAFti4m7t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6eCeFDqQBTDRzcWN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcvhfuGmb7J6atofB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw6OKXZ62XgY3zKeyt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz9V7TYg0BzPMRq6ut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZPBybJxcQnaHiRxZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
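A batch response like the array above can be parsed and indexed by comment ID before writing coding results back to the database. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed value sets are inferred from this one sample batch (the full codebook may define more categories), and `parse_batch` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from this sample batch only;
# the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "fear", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records)
    and return a dict keyed by comment ID, validating each dimension."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# First record from the batch above, used as a worked example.
raw = ('[{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg"]["emotion"])  # approval
```

Keying by ID makes the "Look up by comment ID" view above a single dictionary access, and the validation step catches any off-schema label the model emits before it reaches the coding table.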