Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Look all I’m gonna say is it’s a tool. It can be useful or damaging, it depends …" (ytc_UgzaSQ-kS…)
- "So basically it’s like the movie Terminator we are really going to need John Con…" (ytc_UgxOEp-Cs…)
- "Not surprised that the government wants to deregulate AI, especially the parts t…" (ytc_UgzBa_MVS…)
- "Kids using permanent solutions for temporary problems. It’s crazy to go k*** you…" (ytc_Ugy3ZeRWX…)
- "BBC: Shock shock shock!!! The robot went crazy, and the man went crazy with the …" (ytc_UgwhuAUK_…)
- "They are already doing surgeries with robots only a matter of time.. soon ai its…" (ytc_UgyFmTLek…)
- "We have been hearing of AI driven cars since 10 years. Will take 20 more years. …" (ytc_UgwilIuwY…)
- "The prices of goods will come down after AI is able to produce so much stuff for…" (ytc_UgyqZm9vJ…)
Comment
> By simple logic. AI is made by imperfect humans. Their character will rub off in the programming without intentionally being done. Will humans have a God given conscious, the AI does not. Will people can change AI cannot. It will justify its means of survival and dealing with others horribly. It shouldn’t be that way, but self sacrifice is something that humans have done and know AI could possibly understand that or even want to.
Source: youtube
Posted: 2024-12-16T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgymSciS9-4kOGe8DB94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxvgirs5dDdgts0iKB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwH8vxiYbI7QZM52Nh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxDF3aqTeiSw2_j33l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxe_EuwLuNMIEukIB94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6YTviQ9iI91qN4s54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2Fb4PoVBLBju5ufJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwn95Sno3IE0-HJI354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxG8e8KV2CqJHIHpDR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzs6NGPlFBW8eGDs5p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}]
```
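A raw response like the one above is a JSON array with one object per coded comment, keyed by comment ID. The following is a minimal sketch of how such a payload could be parsed and sanity-checked before indexing; the allowed-value sets are inferred from the sample output shown here, not from an official codebook, and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above (an assumption, not an official codebook).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    rejecting any value outside the expected category sets."""
    out = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        out[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return out

# Hypothetical single-record payload in the same shape as the raw response.
raw = '[{"id":"ytc_abc","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"}]'
codings = parse_codings(raw)
print(codings["ytc_abc"]["emotion"])  # fear
```

Validating against fixed category sets at parse time catches the most common failure mode of structured LLM output: a value that drifts outside the requested label vocabulary.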