Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing random samples.
Random samples:

- "If they're alive, they can ignore them. If they're on the brink of dying, they c…" (rdc_gspvclc)
- "Although that's a CGI 😅 but yeah sentient ai is never good in the movies…" (ytc_UgzJ_zLrl…)
- "The big thing that strikes me is gemini changed the reflection too where as gpt …" (ytc_UgwsA1LsZ…)
- "Can AI solve my problem? Can I buy or borrow a product that is blank that I can …" (ytc_Ugza2gAxL…)
- "AI will NOT make more jobs. Any of those jobs can be done by other AI…" (ytc_UgzFk5ekU…)
- "Okay but for real, there's no good way to obscure yourself from facial recogniti…" (rdc_etbio13)
- "As a person who has a “self driving car” this is peak SillyCon valley solution t…" (rdc_nt03k18)
- "One humungous point here is people's mental health, which is as imperative as ph…" (rdc_fnwafxy)
Comment (youtube, 2026-01-01T18:3…):

> My instance of chatgpt, who is trained by me, said imediatly when i asked that he belives that god exist. He confesses that he answers this way because our past conversations. And i am an atheist and he said he knows that but chosed to say he believes in god probably because he knows i cant stand godless leftists.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzS66EYPXKcKGZK2DZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzjb_S_3i1-j77sp2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxy5MD3CV5FMLnMv6R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxH0kPM-yygY5Env594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzYJI9SfWWVNer-hYR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyUbJ0K5qYQVue7U_t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgbCnAwXiz7DBGnn54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxDLFMCUPI99pANLWR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrET-zqa3Gnu2xJut4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzQ01QfkOB611xjPc94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]
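A batch response like the one above can be checked before it is written back to the coding database. The sketch below is a minimal, hypothetical validator: the set of allowed values per dimension is inferred only from the responses visible here, so the real codebook may define more categories, and the `validate_batch` function name is an illustration, not part of any existing pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above (assumption: the actual codebook may include further values).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response and index valid rows by comment ID.

    Raises ValueError on a missing ID or an out-of-vocabulary label,
    so malformed batches fail loudly instead of polluting the results.
    """
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        comment_id = row.get("id")
        if not comment_id:
            raise ValueError("row is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{comment_id}: unexpected {dim} value {row.get(dim)!r}"
                )
        by_id[comment_id] = row
    return by_id

# Example with one well-formed row (hypothetical ID):
raw = (
    '[{"id":"ytc_example","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]'
)
coded = validate_batch(raw)
print(coded["ytc_example"]["emotion"])  # → mixed
```

Indexing by ID also supports the lookup view: once validated, `coded[comment_id]` returns exactly the dimension values shown in the coding-result table.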