Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI needs to be stopped a technolgy that is going harm people for benifit of few …" (ytc_Ugz2iSq0r…)
- "I think it really can't be that hard to take a file of ALL war crimes, upload it…" (ytc_Ugz2BDNA6…)
- "I hate AI and the ability to shut it down from my cell. I going to purchase a …" (ytc_UgwwiWLvd…)
- "them citing a study tesla did in germany is vile... For context. while autopilot…" (ytc_Ugw5tMKtL…)
- "You last conclusion is wrong: by saying ‘sorry’ the AI keeps the conversation go…" (ytc_UgxmABsxk…)
- "AI should be controlled in a similar way to nuclear arms. It should come with th…" (ytc_Ugwv3seHZ…)
- "If AI is blackmailing the IDF... I'd say let them do their thing, they're doing …" (ytc_UgxiqaunV…)
- "Okay nevermind, I take back all the bad things I said about AI gooners. Keep on …" (rdc_mukb2l7)
Comment

> @toppsfamilyadventures8884 I doubt these two claims:
> "Most animals [...] when they are hungry, they will automatically eat. They don't make choices."
> "Computers cannot do resist right now, if their programming says to do something, it will do that thing exactly."
> Why do you think these?
> Thank you for responding.

Source: youtube · AI Moral Status · 2023-08-23T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
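The four coded dimensions in the table can be sanity-checked against a controlled vocabulary. A minimal sketch, assuming the allowed value sets below (inferred only from values visible on this page, not from the actual codebook):

```python
# Hypothetical vocabularies, inferred from values observed in this dashboard;
# the real codebook may allow more labels.
SCHEMA = {
    "responsibility": {"none", "developer", "user", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "approval", "resignation"},
}

def validate(code: dict) -> list:
    """Return (dimension, bad_value) pairs for out-of-vocabulary codes."""
    return [(dim, code.get(dim)) for dim in SCHEMA if code.get(dim) not in SCHEMA[dim]]

# The coding result shown above passes the check.
record = {"responsibility": "none", "reasoning": "deontological",
          "policy": "none", "emotion": "indifference"}
print(validate(record))  # → []
```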
Raw LLM Response
```json
[{"id":"ytr_UgyQKVcU_TX3f-C9qC14AaABAg.9teIbM53Jx39tf6jBP5EZL","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwl6zP57ypPqqbh5aN4AaABAg.9teGL19hLzG9tea5zQYSPe","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzjiad5p60UKbWpfCV4AaABAg.9teDxSIXJqN9texR8QbguJ","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugzjiad5p60UKbWpfCV4AaABAg.9teDxSIXJqN9texs6PFs_y","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugzjiad5p60UKbWpfCV4AaABAg.9teDxSIXJqN9teyUjIYgnv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw8pVMDZ8MhE1Gyf8Z4AaABAg.9teD5dfBS3c9tffDFMGDbp","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgxhwYtCqF3Hu79kPcJ4AaABAg.9teBfFUsLcQ9teDXtCmbnc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugwc1ty0ImoEXOg2exZ4AaABAg.9teB1ZsU1AV9tfsWP2o892","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwc1ty0ImoEXOg2exZ4AaABAg.9teB1ZsU1AV9tkCtQ4yYOZ","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw6xOWQvHU9u0Dpcwp4AaABAg.9te8e-LK7sU9tf7T40o_Ub","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}]
```
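The comment-ID lookup this page offers can be sketched over a raw response like the one above: parse the JSON array and index it by `id`. A minimal sketch; the `a1`/`b2` IDs are hypothetical stand-ins for the real `ytr_…` identifiers:

```python
import json

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of code objects)
    and index the coded dimensions by comment ID."""
    rows = json.loads(raw_response)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

# Stand-in records mirroring the shape of the raw response above.
raw = '''[
  {"id": "a1", "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"},
  {"id": "b2", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]'''

codes = index_by_id(raw)
print(codes["a1"]["reasoning"])  # → deontological
```

With the index in hand, inspecting the exact model output for any coded comment is a dictionary access rather than a scan of the array.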