Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@DWS205 “well it’s clear from your question you don’t even understand algorithms and logarithims.” Yes. Algorithms, I’m told, are sets of instructions, but that doesn’t tell me much (I think I even remember the BBC website telling me a cake recipe is actually an algorithm). As for logarithms. I have looked at these before, and did a few basic problems. If I remember correctly they’re like the inverse of powers (exponents), or something like that. “It’s more acting like a calculator that can kind of push its own buttons, and can express it’s regurgitation in plain language.” A calculator doesn’t have all the answers stored to all the sums etc that it will encounter, yeah? Does it instead have a programme or something telling it how to deal with each sum etc? I’ve been hearing chatGPT is ‘probabilistic’. It is trained on vast amounts of data. It then predicts, based on probability, the next word in the sentence, or something like that?
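The commenter's intuition is roughly right: a language model learns, from its training data, a probability for each possible next word and predicts from that distribution. A minimal illustrative sketch at toy scale (a hypothetical bigram counter, not how ChatGPT is actually implemented):

```python
from collections import Counter, defaultdict

# Tiny corpus standing in for "vast amounts of data".
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

# "Predict the next word": turn the counts after "the" into probabilities.
counts = following["the"]
total = sum(counts.values())
probs = {w: c / total for w, c in counts.items()}
best = max(probs, key=probs.get)
print(probs)  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
print(best)   # cat
```

Nothing is "stored as answers"; the model only keeps statistics about continuations, which is the sense in which it is probabilistic rather than a lookup table.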
youtube AI Moral Status 2025-05-13T18:3… ♥ 5
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugxkjh_uSaoInylUFGl4AaABAg.AI0l3OT7SgLAIDM2vONtTm", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugxkjh_uSaoInylUFGl4AaABAg.AI0l3OT7SgLAIDagKgdSKD", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxKHlDHrHvceSKKJV94AaABAg.AHvai8HWIAnAI08sqy5N28", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgweYzH9xHDWlQxIslx4AaABAg.AHnOfHD69I6AI9qKQoR3t4", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxzFbwZRF-o2NC1chh4AaABAg.AHlcTzzxMM5AI8YeSc8_35", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxzFbwZRF-o2NC1chh4AaABAg.AHlcTzzxMM5AIBuQ0MpPOK", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgxzFbwZRF-o2NC1chh4AaABAg.AHlcTzzxMM5AIBx14jibQh", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxkqCUJbprf6dqXQW54AaABAg.AHkpg_sTxq7AHvHPcZDnx1", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxkqCUJbprf6dqXQW54AaABAg.AHkpg_sTxq7AI4107HB5WT", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxkqCUJbprf6dqXQW54AaABAg.AHkpg_sTxq7AI4A63kFI-R", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
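Inspecting a batch like this programmatically is straightforward: the raw response is a JSON array of per-comment records. A sketch using Python's standard library (the ids and values here are shortened, hypothetical stand-ins for the real records above):

```python
import json
from collections import Counter

# Hypothetical two-record batch standing in for the raw response above.
raw = '''[
  {"id": "ytr_example_1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example_2", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index by comment id so any coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}
print(by_id["ytr_example_2"]["emotion"])  # fear

# Tally each coding dimension across the batch.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(dim, Counter(r[dim] for r in records))
```

Indexing by `id` mirrors how the dashboard ties each record back to its source comment.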