Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As I've demonstrated before and can easily demonstrate again, modern conversational, artificially intelligent software has great difficulty navigating logic sequences (inductive as well as deductive reasoning) that it has never been exposed to. Common lines of logical reasoning are easily dealt with by pointing to them during conversation. This, then, is what ChatGPT was doing with you, Alex. "This sentence is false" and "I am lying" are quite famous paradox tricks that you managed to get ChatGPT to engage with. Your best trick to confound it (to catch a logic-processing algorithm in illogical statements) is to have it try to solve a complex logic question that it has never encountered before. Such as:

Six men, Andrews, Blaine, Colter, Doister, Ebert, and Fisher, are the only members eligible for the offices of president, vice-president, and secretary in a certain organization.
- Andrews won't be an officer unless Ebert is president.
- Blaine won't serve if he outranks Colter.
- Blaine won't serve with Fisher under any conditions.
- Colter won't serve with both Ebert and Fisher.
- Colter won't serve if Fisher is president or Blaine is secretary.
- Doister won't serve with Colter or Ebert unless he outranks them.
- Ebert won't be vice-president.
- Ebert won't be secretary if Doister is an officer.
- Ebert won't serve with Andrews unless Fisher serves, too.
- Fisher won't serve unless either he or Colter is president.

How can the three offices be filled? It would be interesting to see you challenge ChatGPT with this puzzle.
youtube AI Moral Status 2024-08-01T23:1…
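The puzzle in the comment above is a small constraint-satisfaction problem, so it can be checked mechanically by enumerating every assignment of three of the six men to the three ranked offices. Below is a minimal brute-force sketch; the function names are illustrative, and the reading of "Blaine won't serve if he outranks Colter" as "Blaine serves only under a higher-ranked Colter" is an interpretive assumption noted in the comments.

```python
from itertools import permutations

MEN = ["Andrews", "Blaine", "Colter", "Doister", "Ebert", "Fisher"]
# A slate is (president, vice-president, secretary); index 0 is the highest rank.

def rank(slate, man):
    """Office index (0 = president) if the man serves, else None."""
    return slate.index(man) if man in slate else None

def outranks(slate, a, b):
    """True if both serve and a holds a higher office than b."""
    ra, rb = rank(slate, a), rank(slate, b)
    return ra is not None and rb is not None and ra < rb

def valid(slate):
    pres, vp, sec = slate
    serves = lambda m: m in slate
    # 1. Andrews won't be an officer unless Ebert is president.
    if serves("Andrews") and pres != "Ebert": return False
    # 2. Blaine won't serve if he outranks Colter
    #    (interpreted as: Blaine serves only under a higher-ranked Colter).
    if serves("Blaine") and not outranks(slate, "Colter", "Blaine"): return False
    # 3. Blaine won't serve with Fisher under any conditions.
    if serves("Blaine") and serves("Fisher"): return False
    # 4. Colter won't serve with both Ebert and Fisher.
    if serves("Colter") and serves("Ebert") and serves("Fisher"): return False
    # 5. Colter won't serve if Fisher is president or Blaine is secretary.
    if serves("Colter") and (pres == "Fisher" or sec == "Blaine"): return False
    # 6. Doister won't serve with Colter or Ebert unless he outranks them.
    if serves("Doister"):
        for other in ("Colter", "Ebert"):
            if serves(other) and not outranks(slate, "Doister", other): return False
    # 7. Ebert won't be vice-president.
    if vp == "Ebert": return False
    # 8. Ebert won't be secretary if Doister is an officer.
    if sec == "Ebert" and serves("Doister"): return False
    # 9. Ebert won't serve with Andrews unless Fisher serves, too.
    if serves("Ebert") and serves("Andrews") and not serves("Fisher"): return False
    # 10. Fisher won't serve unless either he or Colter is president.
    if serves("Fisher") and pres not in ("Fisher", "Colter"): return False
    return True

solutions = [s for s in permutations(MEN, 3) if valid(s)]
for pres, vp, sec in solutions:
    print(f"president={pres}, vice-president={vp}, secretary={sec}")
```

Under this reading the enumeration reports a single slate: Colter as president, Blaine as vice-president, and Ebert as secretary.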
Coding Result
Responsibility: unclear
Reasoning: unclear
Policy: unclear
Emotion: unclear
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
 {"id":"ytc_UgykR4e9IegK35QZl8V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw19IAYO6p3Fp7qsz94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugyu9sZ4eF5UPEV_6UN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzsO4h8pqlLhnBOfb54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyRxL9RUaFZsbJLeNx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwDT2zAZvSUjbSWE2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzmYiY6VL_TpeZak994AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugwciz_LOjb1e01WmA14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxYFUnIDEZW-pJhHiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxHtMnzcXfBFFJAI914AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
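The raw response is a JSON array of per-comment records. A minimal sketch of loading and tallying it, assuming the field names shown above (the snippet embeds only the first two records as a sample):

```python
import json
from collections import Counter

# Two records excerpted from the raw LLM response above (the full array has ten).
raw = '''[
 {"id":"ytc_UgykR4e9IegK35QZl8V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw19IAYO6p3Fp7qsz94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]'''

records = json.loads(raw)
for r in records:
    print(f'{r["id"]}: responsibility={r["responsibility"]}, emotion={r["emotion"]}')

# Tally the emotion labels across all parsed records.
emotions = Counter(r["emotion"] for r in records)
print(emotions)
```

Swapping the full ten-record array into `raw` would tally the whole coded batch the same way.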